
Reading Notes #348


Cloud


Programming


Data


Miscellaneous


~enjoy!

What happens when you mix Asp.Net Core, different versions of Docker and Azure for the first time

For a project I have, I wanted to validate whether containers were easier to use compared to regular code with services in Azure. I needed to refresh myself with Docker, so I decided to do what I thought would be a simple test: create an ASP.NET Core website in a container and access it on my machine.

This post is about my journey to finally achieve this goal; as you may guess, it didn't work on the first attempt.

The Goal


One reason I was looking at containers is that they are supposed to work everywhere, right? Well, yes, but sometimes with a little effort. The goal here is to be able to run the same container on my main PC, my Surface, a Linux VM and, of course, in Azure.

The Context


I have a different setup on my main machine and on my Surface. On my PC, I'm using VirtualBox for my VMs, so I'm not running Docker for Windows but Docker Toolbox. This flavor (an older version) of Docker will create a VM in VirtualBox instead of Hyper-V. I couldn't use Docker for Windows like on my Surface, because the two virtualization technologies don't run side by side.

I also wanted to use only tools available on each of these platforms, so I decided not to use the Visual Studio IDE (the big one). Moreover, I wanted to understand what was happening, so I didn't want too much magic involved. Visual Studio is a fantastic tool and I love it. :)

Installing Docker


I needed to install Docker on my Surface. I downloaded Docker Community Edition (CE), and because Hyper-V was already installed, everything ran smoothly. On Windows, you need to share the "C" drive from the Docker settings. However, I was getting a strange "bug" when trying to share mine: it was asking me to log in with AzureAD and was ignoring my request, leaving the shared drive unchecked.

Thanks to my new friend Tom Chantler, I didn't search for too long. The thing is, I'm using an AzureAD account to log in, and something is not working right at the moment. As explained in Tom's post, Sharing your C drive with Docker for Windows when using Azure Active Directory, to work around this situation, I only had to create a new user account with the exact same name as my AzureAD account, but without the AzureAD prefix (ex: AzureAD\FBoucher became FBoucher). Once that was done, I could share the drive without any issue.

Let's get started with the documentation


The HelloWorld container worked like a charm, so I was ready to create my ASP.NET Core website. My reflex was to go to docs.docker.com and follow the instructions from Create a Dockerfile for an ASP.NET Core application. I was probably doing something wrong, because it didn't work. So I decided to start from scratch and do every step manually... I always learn more that way.

Let's start at the beginning


Before moving everything into a container, we need a web application. This can easily be done from the terminal/command prompt with these commands:

dotnet new mvc -o dotnetcoredockerappservicedemo

cd dotnetcoredockerappservicedemo

dotnet restore

dotnet publish -c release -o app/ .

Here we create a new folder with a website using the mvc template. I then go into that new folder and restore the NuGet packages. To test the website locally, simply use dotnet run. And finally, we build and publish the application into the subfolder app.
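
For that local test, it's a single command; by default the site listens on http://localhost:5000:

# run the site locally before containerizing it
dotnet run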


Moving to Docker


Now that we have our app, it's time to containerize it. We need to add some Docker instructions in a Dockerfile. Add a new file named dockerfile (no extension) to the root folder and copy/paste these commands:

# dockerfile

FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /app
COPY /app /app
ENTRYPOINT ["dotnet", "dotnetcoredockerappservicedemo.dll"]

To start Docker with Docker Toolbox, just start the Docker Quickstart Terminal.
These instructions specify how to build our container. First, Docker downloads the image microsoft/dotnet:2.1-aspnetcore-runtime. We set the working directory, then copy the app folder to the app folder inside the container. Finally, we specify the entry point of our application, telling it to start with dotnet.
Like Git and its .gitignore file, Docker has the same thing with a .dockerignore file (no extension). Add that file to your folder to ignore the bin and obj folders.


# .dockerignore
bin\
obj\
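
As an aside, if you would rather let Docker run the build and publish steps instead of calling dotnet publish on the host first, a multi-stage Dockerfile can do both. Here is a minimal sketch, not what I used in this demo; the microsoft/dotnet:2.1-sdk image tag and the /src folder are assumptions:

# hypothetical multi-stage dockerfile: build in the sdk image, run in the runtime image
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c release -o /app

FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "dotnetcoredockerappservicedemo.dll"]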

Now that the instructions about how to build our container are complete, we can build it. Execute the following command:

docker build -t dotnetcoredockerappservicedemo .

This will build dotnetcoredockerappservicedemo from the current folder.

Running the Docker container locally


Everything is in place; the only thing missing is to run it. To run it locally, just go with this command:

docker run -p 8181:80 dotnetcoredockerappservicedemo

On my machine, port 80 is always in use, so I remapped port 80 to 8181; feel free to change it at your convenience. The website will be available at localhost:8181.
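
A quick way to confirm the container is up and to see the port mapping (standard Docker, nothing specific to this demo):

# list the running containers and their port mappings
docker ps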

If you are running Docker Toolbox (the older version of Docker), you need to get the IP of its VM instead. To get it, run:

docker-machine ip
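
The website is then available at that IP on the remapped port, for example http://192.168.99.100:8181 (192.168.99.100 being the typical Docker Toolbox default; yours may differ).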

Running in the cloud


To run our container in Azure, you will need to publish it to the cloud first. It could be on Docker Hub or in a private registry on Azure. I decided to go with Azure. First, we need to create a registry, then publish our container.

az group create --name dotnetcoredockerappservicedemo --location eastus

az acr create --resource-group dotnetcoredockerappservicedemo --name frankContainerDemo01 --sku Basic --admin-enabled true

az acr credential show -n frankContainerDemo01

The last command, az acr credential show, provides the information to tag our container with our repository name and also gives us the credentials to be able to push. Of course, you could go to portal.azure.com and get the information from the registry's Access keys blade.
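
If you are scripting this, one option is to capture the password into a variable instead of copying it from the output (a small sketch; the ACR_PASSWORD variable name is mine):

# store the registry admin password for later use
ACR_PASSWORD=$(az acr credential show -n frankContainerDemo01 --query "passwords[0].value" -o tsv)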

docker tag dotnetcoredockerappservicedemo frankcontainerdemo01.azurecr.io/dotnetcoredockerappservicedemo:v1

Let's log Docker into our registry, and then push (upload) our container to Azure.


# The https:// is important...

docker login https://frankcontainerdemo01.azurecr.io -u frankContainerDemo01 -p <Password_Retrieved>

docker push frankcontainerdemo01.azurecr.io/dotnetcoredockerappservicedemo:v1


Great, the container is in Azure. Now let's create a quick web app to see it. We could also use Azure Container Instances (ACI), which would be only one command, but because the demo is a website, it wouldn't make sense to use ACI for that.

To get an App Service, we need a service plan, and then we will create an "empty" web app: we specify the runtime without providing any code/binary/container. I wasn't able to create a web app from a private Azure registry in one command, which is why I'm doing it in two.

az appservice plan create --name demoplan --resource-group dotnetcoredockerappservicedemo --sku S1 --is-linux

az webapp create -g dotnetcoredockerappservicedemo -p demoplan -n frankdockerdemo --runtime "DOTNETCORE|2.1"

On Windows, I got the following error message: '2.1' is not recognized as an internal or external command, operable program or batch file. The PowerShell command line escape "--%" solves the problem: az --% webapp create -g dotnetcoredockerappservicedemo -p demoplan -n frankdockerdemo --runtime "DOTNETCORE|2.1"

If you check the website right now, you should see a page saying that the site is up but empty. Let's update the container settings with our registry and container information.

az webapp config container set -n frankdockerdemo -g dotnetcoredockerappservicedemo --docker-custom-image-name frankcontainerdemo01.azurecr.io/dotnetcoredockerappservicedemo:v1 --docker-registry-server-url https://frankcontainerdemo01.azurecr.io --docker-registry-server-user frankContainerDemo01 --docker-registry-server-password <Password_Retrieved>
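
To check the result without leaving the terminal, you can open the site straight from the CLI (a quick sketch reusing the same names as above):

# open the deployed site in the default browser
az webapp browse -n frankdockerdemo -g dotnetcoredockerappservicedemo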

It works, of course!


Conclusion


It's only four steps: create the .NET Core application, package it into a Docker container, publish the container into our Azure registry, and create an App Service based on that container. However, because all these technologies are cross-platform, you sometimes get tiny differences between platforms, and those can become time-consuming. It was a great little project that turned out to be a lot more than expected, but I learned so much!

I'm very happy with the result... expect more Docker in the future!


In a video, please!


I also have a video of this post if you prefer.





Reading Notes #339


Cloud



Programming



Data



Miscellaneous



~Enjoy!


Reading Notes #333

Cloud


Programming


Data


Reading Notes #331


Cloud


Programming



Books



Miscellaneous



Reading Notes #326

Cloud


Programming


Miscellaneous


Books



Start with Why: How Great Leaders Inspire Everyone to Take Action
Author: Simon Sinek
- Fantastic book. One to read and read again, to really capture and validate our path, our why.


How to copy files between Azure subscription from Windows, Linux, OS X, or the cloud

(this post is also available in French: ici)

Copy, Download or Upload from-to any combination of Windows, Linux, OS X, or the cloud


Data is and will always be our primary concern. Whether shaped as text files, images, VM VHDs or in any other way, at some point in time our data will need to be moved. I already wrote about this previously, and the content of that post is still valuable today, but I wanted to share new options and cover all the ground (meaning Linux, Windows and OS X).

Scenarios


Here are a few scenarios where you would want to move data.
  • Your Microsoft Azure trial is ending, and you wish to keep all the data.
  • You are creating a new web application, and all those images need to be moved to the Azure subscription.
  • You have a Virtual Machine that you would like to move to the cloud or to a different subscription.
  • ...

AZCopy


AzCopy is a fantastic command-line tool for copying data to and from Microsoft Azure Blob, File, and Table storage. Originally, AzCopy was only available for Windows users; however, a second version built with the .NET Core framework is now available for Linux and OS X. The commands are very similar but not exactly the same, so I will cover the Windows version first and the .NET Core version later in this post.

AzCopy on Windows


In its simplest expression, an AzCopy command looks like this:
AzCopy /Source:<source> /Dest:<destination> [Options]
If you have previously installed an Azure SDK on your machine, you already have it. By default, AzCopy is installed to %ProgramFiles(x86)%\Microsoft SDKs\Azure\AzCopy (64-bit Windows) or %ProgramFiles%\Microsoft SDKs\Azure\AzCopy (32-bit Windows).

If you only need AzCopy, for a server for example, you can download the latest version of AzCopy.
Let's see some frequent usages. First, let's say you need to move all those images from your server to an Azure Blob storage container.
AzCopy /Source:C:\MyWebApp\images /Dest:https://frankysnotes.blob.core.windows.net/blog /DestKey:4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== /S

Then, copying those images to another subscription is very easy.
AzCopy /Source:https://frankysnotes.blob.core.windows.net/blog /Dest:https://frankshare.blob.core.windows.net/imagesbackup /SourceKey:4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== /DestKey:EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== /S

AzCopy Parameters


These examples were simple, but AzCopy is a very powerful tool. I invite you to type one of the following commands to discover more about using AzCopy:
  • For detailed command-line help for AzCopy: AzCopy /?
  • For command-line examples: AzCopy /?:Samples
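
To give an idea of that flexibility, here is a small sketch using two options of the Windows version; <StorageKey> is a placeholder for your own key:

# copy only the .jpg files, skipping source files that are not newer than the destination
AzCopy /Source:C:\MyWebApp\images /Dest:https://frankysnotes.blob.core.windows.net/blog /DestKey:<StorageKey> /Pattern:*.jpg /XO /S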

AzCopy on Linux


Before you can install AzCopy, you will need to install .NET Core. This is done very simply with a few commands.
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-xenial-prod xenial main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-get update
sudo apt-get install dotnet-sdk-2.0.2
Then, to install AzCopy itself, you just need to get it with a wget command, extract it, and execute the install script.
wget -O azcopy.tar.gz https://aka.ms/downloadazcopyprlinux 
tar -xf azcopy.tar.gz 
sudo ./install.sh
In its simplest expression, the .NET Core version of an AzCopy command looks like this:
azcopy --source <source> --destination <destination> [Options]
It is very similar to the original version, but parameters use -- and - instead of /, and where a : was required, it's now a simple space.

Uploading to Azure


Here is an example that copies a single file, GlobalDevopsBootcamp.jpg, to an Azure Blob storage container. We pass the full local path of the file to --source, the full URI to --destination, and finally the destination storage key to --dest-key. Of course, you could also use a SAS token if you prefer.
azcopy \
--source /home/frank/demo/GlobalDevopsBootcamp.jpg \
--destination https://frankysnotes.blob.core.windows.net/blog/GlobalDevopsBootcamp.jpg \
--dest-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== 

Copying Between Azure Subscriptions


To copy the image to a second Azure subscription, we use the same command; the source is now an Azure storage URI, and we pass both the source and the destination keys:
azcopy \
--source https://frankysnotes.blob.core.windows.net/blog/GlobalDevopsBootcamp.jpg \
--destination https://frankshare.blob.core.windows.net/imagesbackup/GlobalDevopsBootcamp.jpg \
--source-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== \
--dest-key EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== 

Azure CLI


Azure CLI is a set of cross-platform commands for the Azure platform. It provides tools to manipulate all Azure components, but this post will focus on the storage features.

There are two versions of the Azure Command-Line Interface (CLI) currently available:

  • Azure CLI 2.0: written in Python, compatible only with the Resource Manager deployment model.
  • Azure CLI 1.0: written in Node.js, compatible with both the classic and Resource Manager deployment models.

Azure CLI 1.0 is deprecated and should only be used for support with the Azure Service Management (ASM) model with "classic" resources.

Installing Azure CLI


Let's start by installing Azure CLI. Of course, you can download an installer, but since everything is evolving very fast, why not get it from the Node Package Manager (npm)? The installation is the same; you just need to specify the version if you absolutely need Azure CLI 1.0.

sudo npm install azure-cli -g


To keep with the previous scenario, let's try to copy all the images to a blob storage container. Unfortunately, Azure CLI doesn't offer the same flexibility as AzCopy, and you must upload the files one by one. However, to upload all the images from a folder, we can easily put the command in a loop.

for f in Documents/images/*.jpg
do
   azure storage blob upload -a frankysnotes -k YoMjXMDe+694FGgOaN0oaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQWvGrWnWj5m4X1U0HIPUIAA==  $f blogimages
done


In the previous command, -a was the account name and -k was the access key. Both of these can easily be found in the Azure portal. From the portal (https://portal.azure.com), select the storage account, then click Access keys in the side menu.


To copy a file (ex: a VM disk, aka a VHD) from one storage account to another, in a different subscription or region, it's really easy. This time we use the command azure storage blob copy start, and the -a and -k parameters refer to our destination.

azure storage blob copy start 'https://frankysnotes.blob.core.windows.net/blogimages/20151011_151451.MOV' imagesbackup -k EwXpZ2uZ3zrjEbpBGDfsefWkj3GnuFdPCt2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== -a frankshare

The nice thing about this command is that it's asynchronous. To see the status of your copy, just execute the command azure storage blob copy show:

azure storage blob copy show -a frankshare -k YoMjXMDe+694FGgOaN0oPaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQVxsqhWvGrWnWj5m4X1U0HIPUIAA== imagesbackup 20151011_151451.MOV


Azure CLI 2.0 (Windows, Linux, OS X, Docker, Cloud Shell)


Azure CLI 2.0 is Azure's new command-line experience, optimized for managing and administering Azure resources that work with Azure Resource Manager. Like the previous version, it works perfectly on Windows, Linux, OS X and Docker, but also from the Cloud Shell!


Cloud Shell is available right from the Azure portal, without any plugin.

Uploading to Azure


The commands are the same as in the previous version, except that the CLI is now named az. Here is an example that uploads a single file into an Azure Blob storage container.

az storage blob upload --file /home/frank/demo/CloudIsStrong.jpg \
--account-name frankysnotes \
--container-name blogimages --name CloudIsStrong.jpg \
--account-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA==
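
To confirm the upload, listing the container's blobs works well (a quick sketch; <Account_Key> is a placeholder for the key shown above):

# list the blobs in the blogimages container as a table
az storage blob list --container-name blogimages --account-name frankysnotes --account-key <Account_Key> --output table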

Copying Between Subscriptions


Let's now copy the file to another Azure subscription. A thing to be aware of is that --account-name and --account-key refer to the destination, even though that's not explicit in their names.
az storage blob copy start \
--source-account-name frankysnotes  --source-account-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== \
--source-container blogimages --source-blob CloudIsStrong.jpg   \
--account-name frankshare  --account-key EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== \
--destination-container imagesbackup  \
--destination-blob CloudIsStrong.jpg 
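
Like the 1.0 version, this copy is asynchronous. To check on its progress, one way is to query the blob's copy properties (a minimal sketch; <Destination_Key> is a placeholder):

# show the copy status of the destination blob
az storage blob show --container-name imagesbackup --name CloudIsStrong.jpg \
--account-name frankshare --account-key <Destination_Key> \
--query properties.copy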

In Video Please!


If you prefer, I also have a video version of this post.



One More Thing


Sometimes we don't need to script things, and a graphical interface is much better. For this kind of situation, the tool of choice is Azure Storage Explorer. It does a lot: upload, download, and manage blobs, files, queues, tables, and Cosmos DB entities. And it works on Windows, macOS, and Linux!


It's just the beginning


This post was just an introduction to two very powerful tools. I strongly suggest reading the official documentation to learn more. Use the comments to share all your questions and suggestions.


Reading Notes #315

Suggestion of the week



Cloud


Programming


Databases


Miscellaneous





Photo credit: nickpettican on Visual Hunt / CC BY-NC-ND

Reading Notes #308

Suggestion of the week


Cloud


Programming


Miscellaneous


Reading Notes #300

Cloud


Programming


Data


Miscellaneous



Reading Notes #297


Suggestion of the week

  • How to uninstall Scrum (Erwin Verweij) - When you read that post (because you must read it... seriously), you will smile, giggle, and maybe even laugh.


Cloud


Programming


Miscellaneous





Reading Notes #289


Cloud


Programming


Databases

  • Migrating to Azure SQL Database (Gavin Payne) - A very interesting and complete post that gathers references and gives details about some of the alternatives when it's migration time.


Reading Notes #276

Suggestion of the week


Cloud


Programming


Miscellaneous

Reading Notes #275

Cloud


Programming


Miscellaneous



~Frank



Reading Notes #267

Cloud


Programming


Miscellaneous




Reading Notes #247

Cloud


Programming


Miscellaneous

Reading Notes #246

Cloud


Programming


Databases


Miscellaneous


Create and Deploy .NET Core WebApp to Azure from Linux

(This post is also available in French.)

The other day, I was glued to my PC and had some spare time (yeah, I know, very unusual). Since .NET Core 1.0 had just been released a few days before, I decided to give it a try. To add an extra layer of fun to the mix, I decided to do it from my Ubuntu VM. In less than 15 minutes, everything was done! I was so impressed I knew I needed to talk about it. That's exactly what this post is about.

The preparation

Before we get started, it's important to know which version of Ubuntu you are using, because some commands will be slightly different. To find the version you are running, simply click the gear in the top right of the desktop screen and select About this Computer. In my case, since I'm using Ubuntu 14.04 LTS, I will be using the commands related to this version. If you are using a different version, please refer to the .NET Core documentation.


Now we need to install .NET Core. Nothing could be easier. Open a Terminal (Ctrl+Alt+T) and type these commands:

# Setup the apt-get feed adding dotnet as repo
sudo sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/dotnet/ trusty main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893

# Get the latest packages
sudo apt-get update

# Install .NET Core SDK
sudo apt-get install dotnet-dev-1.0.0-preview2-003121
Once it's all done, you can type dotnet --info and you should see the installed version details.



Create the Local WebApp

From the same Terminal, we will create an empty folder for our solution and move into it. Execute these commands:
mkdir demodotnetweb
cd demodotnetweb
We now want to create our new web project. This is done with the command dotnet new, but we need to specify the type; otherwise, it will create a console application.
dotnet new -t web
Now, to download all the references (NuGet packages) required by our project, execute the following command:
dotnet restore
Depending on the speed of your Internet connection and how many dependencies are required, this can take from a few seconds to more than one minute.
To test whether our solution works locally, type the command:
dotnet run
That will compile our solution and start hosting it locally with the ASP.NET Core internal web server. Launch a web browser and go to http://localhost:5000 to see the app in action.


Deploy to Azure

To deploy our new project to the cloud, we will use the continuous deployment capability of Azure. First, navigate to portal.azure.com and create a Web App.


Once the application is created, you should see the "This web app has been created" message when you navigate to [nameofWebApp].azurewebsites.net.


It's now time to add a local Git repository. In the Web App settings, select Deployment source. Click on Configure Required Settings, then select the Local Git Repository option.


After setting the user credentials for the repository, we can get the URL from the Essentials section.


Back to our Ubuntu Terminal, we will have to execute these commands:

# create a git repository
git init
# stage all the files
git add -A
# commit all files
git commit -m "Init"

# Add the remote repository
git remote add azure https://username@demowebcore.scm.azurewebsites.net:443/demowebcore.git

# Push the code to the remote
git push azure master
After a minute or so, you should have your web app online!



Voila! That was fun!

Automating Docker Deployment with Azure Resource Manager

Recently, I had to build a solution where Docker containers were appropriate. The idea behind containers is that once they are built, you just have to run them. While that's true, my journey was not exactly that; nothing dramatic, only a few gotchas that I will explain while sharing my journey.

The Goal

The solution is classic, and this post will focus on a single virtual machine (VM). The Linux VM needs a container that automatically runs when the VM starts. Some files are first downloaded from a secure location (Azure Blob storage) and then passed to the container. The solution is deployed using Azure Resource Manager (ARM). For simplicity, I will use an Nginx container, but the same principle applies to any container. Here is the diagram of the solution.

[Diagram of the solution]

The Solution

I decided to use Azure CLI to deploy, since it is also the command line used inside the Linux VM, but Azure PowerShell could do the same. We will be deploying an ARM template containing a Linux VM and two VM extensions: DockerExtension and CustomScriptForLinux. Once the VM is provisioned, a bash script is downloaded by the CustomScriptForLinux extension from the secure Azure Blob storage container myprojectsafe, then executed.
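
To give an idea of the deployment step itself, here is a minimal sketch using the Azure CLI 2.0 syntax; the azuredeploy.json template, its parameters file, and the myproject-rg resource group names are hypothetical:

# deploy the ARM template containing the VM and its two extensions
az group deployment create \
--resource-group myproject-rg \
--template-file azuredeploy.json \
--parameters @azuredeploy.parameters.json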

Reading Notes #222

Suggestion of the week


Cloud


Data