
How to Create a Static Website on Azure Storage

I have been waiting for this feature for so long! I know; it's not a major feature, but it fills an important gap in the Azure offer. We can now create static websites in Azure Blob Storage (as I'm writing this post, the service is still in preview). In this post, I will explain why I think it's really good news, and show how to create and publish a static website.

Why It's Awesome News

The cloud is the perfect place when you need to build something huge very quickly. It's also an excellent solution when the number of resources you require varies a lot. Because Azure is a service, it will provide you as many resources as you would like in a few minutes. And when you are done with the resources, you stop paying for them; it's really great like that!
However, if the only thing you need is to host a little something like a blog, a little website for an event, or some temporary publicity, Azure was not the best place for it. I mean, yes, of course, you could build a service and host many little websites on it (Scott Hanselman has excellent posts about that, like this one), but it always felt a bit overkill for most users. Some people kept an "old style" hosting provider just for that. I mean, it's fine, it works... But with Azure Storage, it will be really reliable, and at a lower cost! Let's see how we can create one.

Create a Static Website

To have the static website feature, you need to create an Azure Blob Storage account the same way you created them before; however, it needs to be of kind General Purpose V2 (GPV2). Today, if you install the Azure CLI Storage-extension Preview, you can use it to create one, or simply use the portal, since it's more visual.

Once the storage is created, open it. On the left menu of the storage blade, click on the Static website (preview) option. That will open the configuration page for our static website. First, click the Enabled button, then enter the initial/index document name (ex: index.html). Finally, click the Save button at the top of the blade.
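If you prefer the command line, here is a minimal sketch of the same configuration, assuming the Storage-extension preview is installed and a storage account named frankdemo (adjust both to your context):

# add the preview extension to the Azure CLI
az extension add --name storage-preview

# enable the static website and set the index document on the account
az storage blob service-properties update --account-name frankdemo --static-website --index-document index.html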

The shell for our website is now created, and a new Azure Blob Storage container named $web has been created. The primary and secondary endpoints should now be displayed. If you test this URL, you will see a message saying that the content doesn't exist... and that's normal.


Create some content

This is the part where it all depends on your needs. You may already have some HTML pages ready, you may want to code them all yourself, or the website may already exist. For this post, I will create a brand-new blog using a static website generator named Wyam (if you would like to see how to do it with Jekyll, another generator, I use it in the video).
To create a new template with Wyam, use the following command in a command prompt. That will create a new website in the subfolder output.
wyam --recipe Blog --theme CleanBlog

Publish to Azure

It's now time to upload our content to the Azure Blob Storage. The easiest way is probably directly from the portal. To upload a file, click on the $web container, then the Upload button. From the new form, select the file and upload it.

The main problem with this method is that it only works one file at a time... and a website usually has many of those...
A more efficient way would be to use Azure Storage Explorer or some script. Azure Storage Explorer doesn't support the Azure Storage static website yet, but it will soon. So that leads us to scripts or command lines.


I really like AzCopy as it's very efficient and easy to use. Unfortunately, as I'm writing this post, AzCopy doesn't support the Azure Storage static website. I tried to upload all content from the output folder (and subfolders) with a command like this, but it fails.
azcopy --source ./output --destination $web --dest-key fec1acb473aa47cba3aa77fa6ca0c5fdfec1acb473aa47cba3aa77fa6ca0c5fd== --recursive

Azure CLI

An Azure CLI extension preview is also available. As I mentioned previously, the extension gives you the possibility to create a static website or update its configuration. To upload files, you have two options: the batch upload is more efficient, of course, but the file-by-file option also works. Thanks to Carl-Hugo (@CarlHugoM) for your help with those commands.

az storage blob upload-batch -s "./output" -d '$web' --account-key fec1acb473aa47cba3aa77fa6ca0c5fdfec1acb473aa47cba3aa77fa6ca0c5fd== --account-name frankdemo

az storage blob upload -f "./output/index.html" -c '$web' -n index.html --account-key fec1acb473aa47cba3aa77fa6ca0c5fdfec1acb473aa47cba3aa77fa6ca0c5fd== --account-name frankdemo

Visual Studio Code Azure Storage Extension

I finally tried the Visual Studio Code Storage extension. After installing it, you need to add a User Setting (Ctrl + ,). Add "azureStorage.preview.staticWebsites" : true to your configuration. Now you just need to click on the extension, select the Azure Blob storage from your subscription, and right-click to be able to upload a folder.
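For reference, here is what that looks like in the settings file once added (plain JSON, nothing else is required):

{
    "azureStorage.preview.staticWebsites": true
}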

Depending on the number of files and their size, it will take a moment. VSCode will notify you when it's done. You will then be able to go back online and refresh your website to see the result.



I'm very happy to see that feature because it fills a need that was not really covered yet by the Microsoft offer. Right now, it's an early preview, so even if the service is very stable, not all the tools support it, but that's only temporary. Right now you can set your custom domain name; however, HTTPS is not supported.
So what do we do with it? Should we wait or jump right on? Well, as the best practices imply, when a feature is in preview, don't put your core business on it yet. If you are just looking to build a personal website or a little promo site, then... enjoy!

In video, please!

I also have a video of this post if you prefer.


How to Deploy your Azure Functions Faster and More Easily with Zip Push

Azure Functions are great. I used to do a lot of "csx" versions (the C# scripted version), but more recently I switched to the compiled version, and I definitely love it! However, I was looking for a way to keep my deployment short and sweet, because sometimes I don't have time to set up a "big" CI/CD pipeline, or simply because sometimes I'm not the one doing the deployment... In those cases, I need a simple script that will deploy everything! In this post, I will share with you how you can deploy everything with one easy script.

The Context

In this demo, I will deploy a simple C# (full .NET Framework) Azure Function. I will create the Azure Function App and storage using an Azure Resource Manager (ARM) template, and deploy with a method named Zip push or ZipDeploy. All the code, script, and template are available on my GitHub.

The Azure Functions Code

The Azure Function doesn't have to be special, and it can be in any language supported by Azure Functions. Simply to show you everything, here is the code of my function.

using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using GenFu;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

namespace AzFunctionZipDeploy
{
    public static class Function1
    {
        [FunctionName("GetTopRunner")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            // Look for a 'top' parameter on the query string
            string top = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "top", true) == 0)
                .Value;

            // Fall back to the request body if it was not on the query string
            if (top == null)
            {
                dynamic data = await req.Content.ReadAsAsync<object>();
                top = data?.top;
            }

            return top == null
                ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a number to get your top x runner on the query string or in the request body")
                : req.CreateResponse(HttpStatusCode.OK, new { message = $"Hello, here is your Top {top} runners", runners = A.ListOf<Person>(int.Parse(top)) });
        }
    }

    class Person
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public int Age { get; set; }
    }
}

It's a really simple function that returns a list of Person objects generated on the fly. The list will contain as many persons as the number passed as a parameter. I'm using the very useful GenFu library, from my buddies: the ASP.NET Monsters.

The only thing we need to do is to create our compressed (zip) file that contains everything our project requires.


In this case, it's the project file (AzFunction-ZipDeploy.csproj), the function's code (Function1.cs), the host (host.json), and the local settings of our function (local.settings.json).
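If you want to script that compression step too, here is a sketch from a bash prompt; the archive name and the zip utility are assumptions, and any tool producing a standard zip file will do:

# from the project folder, compress the files the project requires
zip -r ../zip/deployment.zip AzFunction-ZipDeploy.csproj Function1.cs host.json local.settings.json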

The ARM template

For this demo, we need one Azure Function App. I will use a template that is part of the Azure Quickstart Templates. A quick look at the azuredeploy.parameters.json file shows that the only parameter we really need to set is the name of our application.

{
    "$schema": "",
    "contentVersion": "",
    "parameters": {
        "appName": {
            "value": "zipdeploydemo"
        }
    }
}

To be able to ZipDeploy, we need to add one Application Setting to let the Kudu interface know we need its help to compile our code. To do that, let's open the azuredeploy.json and go to the appSettings section. We need to add a new variable named SCM_DO_BUILD_DURING_DEPLOYMENT and set it to true. After adding the setting, it should look like this (see the last one... that's our new one):

"appSettings": [
    {
        "name": "AzureWebJobsDashboard",
        "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
        "name": "AzureWebJobsStorage",
        "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
        "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
        "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
        "name": "WEBSITE_CONTENTSHARE",
        "value": "[toLower(variables('functionAppName'))]"
    },
    {
        "name": "FUNCTIONS_EXTENSION_VERSION",
        "value": "~1"
    },
    {
        "name": "WEBSITE_NODE_DEFAULT_VERSION",
        "value": "6.5.0"
    },
    {
        "name": "SCM_DO_BUILD_DURING_DEPLOYMENT",
        "value": true
    }
]

The Deployment Script

Now that all the pieces are ready, it's time to put them together in one script. In fact, only the last two commands are required; everything else is just stuff to make it easier to reuse. Check out my previous post, 5 Simple Steps to Get a Clean ARM Template, to learn more about the best practices related to ARM templates. So let's see that script; it's pretty simple.

    # script to Create an Azure Gramophone-PoC Solution

    # the resource group name and its location are passed as arguments
    # (ex: ./deploy.sh cloud5mins eastus) -- the script name is an assumption
    resourceGroupName=$1
    resourceGroupLocation=$2

    # assumed file names and deployment name; adjust them to your project
    templateFilePath="./azuredeploy.json"
    parameterFilePath="./azuredeploy.parameters.json"

    dateToken=`date '+%Y%m%d%H%M'`
    deploymentName="FrankDemo"$dateToken

    # az login

    # You can select a specific subscription if you do not want to use the default
    # az account set -s SUBSCRIPTION_ID

    if ! $(az group exists -g $resourceGroupName); then
        echo "---> Creating the Resourcegroup: " $resourceGroupName
        az group create -g $resourceGroupName -l $resourceGroupLocation
    else
        echo "---> Resourcegroup:" $resourceGroupName "already exists."
    fi

    az group deployment create --name $deploymentName --resource-group $resourceGroupName --template-file $templateFilePath --parameters $parameterFilePath --verbose

    echo "---> Deploying Function Code"
    # the zip file name is an assumption; see the zip step above
    az functionapp deployment source config-zip -g $resourceGroupName -n zipdeploydemo --src "./zip/deployment.zip"

    echo "---> done <---"

The only "new" thing is the last command functionapp deployment source config-zip. That where we specify to the Azure Function App to look to --src to get our source. Because I'm running it locally, the path is pointing to a local folder. However, you could execute this command also in the CloudShell, and that would become a URI... to an Azure Blob Storage by example.

Deploy and Test

If you didn't notice yet, I wrote my script in bash with Azure CLI. That's because I want my script to be compatible with all platforms. Of course, you could have done it in PowerShell or anything else that can call the REST API.

To deploy, just execute the script, passing the resource group name and its location.

    ./deploy.sh cloud5mins eastus


To get the Function URL, go to the Azure portal (portal.azure.com) and click on the Function App that we just deployed. Click on the function, GetTopRunner in this case, and click on the </> Get function URL button.


Use that URL in Postman and pass the parameter top to see if the deployment was successful.
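From the command line, a quick smoke test could look like this (the host name and function key are placeholders):

curl "https://zipdeploydemo.azurewebsites.net/api/GetTopRunner?code=YOUR_FUNCTION_KEY&top=5"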


In Video Please

If you prefer, I also have a video version of this post.


Are the Azure DevOps Projects Worth It?

Imagine you just arrived at the office. You've only taken a sip or two of your coffee or tea. You look at the tasks that need to be done today (well, yesterday based on the request): a new project is starting, and you need to configure everything the team needs to start building that web application. They need a repository, a continuous integration and continuous delivery (CI/CD) pipeline, a place to deploy, monitoring tools, and of course, an environment where they will be able to track their work. Should you panic? No, because you will use the new Azure DevOps Project available in Azure.

Let's Create the project

From the Azure portal (portal.azure.com), click on the plus button and search for "devops". Select DevOps Project, then click on the Create button. Then follow the five steps, and Azure will create everything for you.

What is deployed

  • Your application from many popular frameworks
  • Automatic full CI/CD pipeline integration
  • Monitoring with Application Insights
  • Git Repository
  • Tasks/bugs tracking board
  • Deployment to the platform of your choice

In Video please!


The DevOps Projects are really fantastic and very useful. The fact that everything is packaged together and automatically deployed is a considerable time saver. In short, are the Azure DevOps Projects worth it? Oh yeah!

5 Simple Steps to Get a Clean ARM Template

You have a solution that is already deployed in Azure, and you would like to reproduce it. You know that an Azure Resource Manager (ARM) template could help you do that; unfortunately, you don't know how to get started. In this post, I will share with you the best practices and how I implement them while working on ARM templates.

How to Get your ARM Template

Of course, you could build your ARM template from scratch. However, there are many quickstart templates available on GitHub. Even better, you can also get Azure to generate the template for you!

If you're building a new solution, go to the Azure portal (portal.azure.com) and start creating your resource as usual. But stop just before clicking on the Create button. Instead, click on the link at its side named Download template and parameters. That will open a new blade where you will be able to download the template, the parameters file, and a few scripts in different languages to deploy it.


If your solution is already deployed, you still have a way to get the template. Again, from the Azure portal, go to the resource group of your solution. In the left option panel, click on Automation script.


Step 1 - Use Git

Once you have your ARM template and a parameters file, move them into a folder and initialize a Git repository. Even if it's only a local one, this will give you an infinite supply of Ctrl-Z. By doing multiple commits along your journey to a better and cleaner template, you will always have the option to get back to when your template was "functional".

A fantastic tool to edit ARM templates is Visual Studio Code. It's free, it natively supports Git, and you can install great extensions to help you.

Step 2 - Validate, Validate, Validate, then Commit

Before every commit, validate your template against your resource group; it's a lot faster than finding a problem in the middle of a deployment.

az group deployment validate --resource-group cloud5mins --template-file .\template.json --parameters .\parameters.json
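Once the validation passes, commit; the commit message is a good place to note what you just cleaned up:

git add .
git commit -m "Converted unused parameters into variables"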

Step 3 - Reduce the Number of Parameters

Nobody likes tons of questions, and too many parameters are exactly like too many questions. So reduce them as much as possible. We cannot always just delete those unwanted parameters, because they still provide important information. Instead, move them into the variables section.

You can do that in different ways; let me share mine. I start with the parameters file and bubble up any parameter that I would like to keep. Next, I cut and paste all the unwanted parameters into a new file. Then I use the multi-cursor selection of VSCode to clean them in two clicks.

Once we have all the parameters "converted" into variables, copy them into the variables section of the ARM template. You will need to delete their parameter equivalents from the top of the template.

Now that we have a clean list of parameters and variables, we must fix the references to the converted parameters. To do that, replace all parameters() references with variables(). For example (using a hypothetical storageAccountType parameter), this:

    "[parameters('storageAccountType')]"

will become that:

    "[variables('storageAccountType')]"
Now that we have a more respectable list of parameters, we must be sure that what we expect from them is clear. To do that, we have two simple features at our disposal. The first one is, of course, the name. Use a complete and clear name. Resist the temptation to shorten everything or use too many acronyms. The second is the metadata description. This information will be displayed to users through the portal as tooltips.

    "adminUsername": {
        "type": "string",
        "metadata": {
            "description": "Name of Administrator user on the VM"
        }
    }

Step 4 - Use Unique String

When you deploy in Azure, some names are global and by definition need to be unique. This is why adding a suffix or a unique identifier to your names is a good practice. An excellent way to get an identifier is to use the function uniqueString(). This function will create a 64-bit hash based on the information passed as parameters.

"suffix": "[uniqueString(resourceGroup().id, resourceGroup().location)]"

In the example just above, we pass the identifier of the resource group and its location. It means that every time you deploy to the same resource group in the same location, the suffix will be the same. However, if your solution is deployed to multiple locations (for disaster recovery, or another scenario), the suffix will have a different value.

To use it, let's say the name of a virtual machine was passed as a parameter. Then we will create a variable and concatenate the parameter and our suffix.

"VMName": "[toLower(concat(parameters('virtualMachineName'), variables('suffix')))]",

Then instead of using the parameter inside your ARM template, you will be using this new variable.
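Inside the virtual machine resource, the name property then simply becomes (a minimal sketch, only the relevant properties shown):

    "type": "Microsoft.Compute/virtualMachines",
    "name": "[variables('VMName')]",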

Step 5 - Use Variables

One of the great strengths of ARM templates is that we can use them over and over. This is why we want to avoid anything that is a static name or value. Templates generated from the Azure portal are a snapshot of that particular instance. The best way to stay structured and avoid names that are too fixed is to leverage variables.

When you use an ARM template generated from a "live" and already deployed solution, the template will contain a lot of very specific information about this instance (comments, resource IDs, states, etc.). When you are building a generic template, don't hesitate to delete those.
Let's see some examples.

"RGName": "[toLower(resourceGroup().name)]",
"VMName": "[toLower(concat(parameters('virtualMachineName'), variables('suffix')))]",

"virtualNetworkName": "[concat(variables('RGName'), '-vnet')]",
"networkInterfaceName": "[toLower(concat(variables('VMName'),'-nic-', variables('suffix')))]",
"networkSecurityGroupName": "[toLower(concat(variables('VMName'),'-nsg-', variables('suffix')))]",

"diagnosticsStorageAccountName": "[substring(concat(variables('RGName'), 'diag', variables('suffix')), 0, 24)]",

You may wonder why we need the first variable, RGName, since the resource group name is already available through the resourceGroup() function. Some resources, like the Azure Blob Storage's name, must only contain lowercase characters. By making a variable, we avoid repeating toLower() every time.

You can concatenate two or more variables and/or strings with the "very popular" function concat(). Sometimes the name built from all those strings is too long. You can trim it by using the function substring(stringToParse, startIndex, length). In this case, the Azure Blob Storage requires a name with a maximum of 24 characters.

To learn more about all the available functions and how to use them, visit the Azure Resource Manager template functions page in the Microsoft documentation.

Step 6 - Create "T-Shirt Size" or smart options

The best way to build a good template is to think like the people who will use it. A developer may not know the difference between a Standard_D2s_v3, a Standard_F8, or a Standard_H8, but will clearly know if he needs a medium, a large, or a web development VM.

That means that we will create a parameter with only specific values allowed, and based on that simple selection we will make more specific and technical decisions. See the declaration of the following parameter.

    "EnvironmentSize": {
        "type": "string",
        "defaultValue": "medium",
        "allowedValues": [
            "medium",
            "large"
        ],
        "metadata": {
            "description": "Medium for regular development. Large for huge memory usage"
        }
    }

This parameter will only allow two strings, "medium" or "large"; anything else will return a validation error. If nothing is passed, the default value will be "medium". And finally, the metadata description makes sure the purpose of the parameter is clear and well defined.

Then you define your variable (ex: TS-Size) as an object with two properties, or as many as you have allowed values. Each of these properties can itself contain many other properties.

    "TS-Size": {
        "medium": {
            "VMSize": "Standard_D2s_v3",
            "maxScale": 1
        },
        "large": {
            "VMSize": "Standard_D8s_v3",
            "maxScale": 2
        }
    }

Then to use it, we just need to chain the variable and the parameter. Notice how we have nested square brackets... With the default value, this will resolve to TS-Size.medium.VMSize, in other words Standard_D2s_v3.

"vmSize": "[variables('TS-Size')[parameters('EnvironmentSize')].VMSize]"

I hope you will find those tips as useful as I do. If you have other suggestions or recommendations, don't hesitate to add them in the comment section or reach out to me.

The full ARM template is available at :

In Video Please!

If you prefer, I also have a video version of that post.

Don't install your software yourself

I don't know about you, but I don't like losing time. This is why, a few years ago, I started using scripts to install all the software I need on my computer. Got a new laptop? I just need to execute a script, go grab a coffee, and when I'm back all my favorite (and required) software is installed. On Linux you could use apt-get, and on Windows my current favorite is Chocolatey. Recently I needed to use more virtual machines (VM) in the cloud, and I decided that I should try using a Chocolatey script during the deployment. This way, once the VM is created, the software I need is already installed! This post is all about my journey to get there; all scripts, issues, and workarounds will be explained.

The Goal

Creating a new VM on premises, applying the OS updates, and installing all the tools you need (like the Visual Studio IDE) can take hours... This solution should be done in under 10 minutes (~7 min in my case).
Once the VM is available, it should have Visual Studio 2017 Enterprise, VSCode, Git, and Node.js installed. In fact, I would like to use the same Chocolatey script I use regularly.
# Install Chocolatey
Set-ExecutionPolicy Bypass -Scope Process -Force; iex ((New-Object System.Net.WebClient).DownloadString(''))

# Install Software
choco install visualstudiocode -y
choco install git -y 
choco install nodejs-lts  -y

(Available on gist.github)

The Tools

In this post I will use Azure CLI, because it works in any environment. However, PowerShell can also be used; only a few commands will be different. The VM will be deployed with an Azure Resource Manager (ARM) template. To create and edit the ARM template, I like to use VSCode. You don't need it, but it's so much easier with it! I use two extensions.
The first one, Azure Resource Manager Snippets, will help by generating the schema for our needs. In a JSON file you just need to type arm and voilà! You have a long list of ARM templates!


The second is Azure Resource Manager Tools. This extension provides language support for ARM and some validation. Very useful...


Creating the ARM Template

To get started, create a new JSON file. Then type arm and select the first option to get an empty skeleton. Then add an extra line in resources and type arm again. This time, scroll until you see arm-vm-windows.


A multi-cursor will allow you to edit the name of your VM everywhere in the file in one shot. Hit Tab to navigate automatically to the userName, and Tab again to go to the password.

Now we have a functional ARM template that we could deploy. However, let's add a few things first.

Searching the Image SKUs by Code

One of my favorite VM images for a DevBox is the one that includes Visual Studio pre-installed. One thing to know is that those images are only deployable in an MSDN subscription. To specify which image you want to use, you need to pass a publisher, an offer, and a SKU.
Here is how to find them with Azure CLI commands:
# List all the Publishers that contain VisualStudio (It's case sensitive)
az vm image list-publishers --location eastus --output table --query "[?contains(name,'VisualStudio')]"

# List all offers for the Publisher MicrosoftVisualStudio
az vm image list-offers --location eastus --publisher MicrosoftVisualStudio  --output table

# List all available SKUs for the Publisher MicrosoftVisualStudio with the Offer VisualStudio
az vm image list-skus --location eastus --publisher MicrosoftVisualStudio --offer VisualStudio --output table

Now that all the information is found, search in the ARM template and replace the current values with the ones found. In my case, here are the new values.

"imageReference": {
    "publisher": "MicrosoftVisualStudio",
    "offer": "VisualStudio",
    "sku": "VS-2017-Ent-Win10-N",
    "version": "latest"
}

Adding our Custom Script

Great, now we have a VM with Visual Studio, but our applications are still not installed. That will be done by adding the Custom Script Extension for Windows to our template. On the documentation page, a sample schema is there, ready to be used.
The last node of your template is currently another extension. For the purpose of this blog post, let's remove it. You should have something like this.


We will copy/paste the snippet from the documentation page and change a few little things: change the type (thanks to our VSCode extension for that catch) and update the dependencies to reflect our demo.

To use the extension, your script needs to be available online. It could be in a blob storage (with some security) or just publicly available. In this case, the script is publicly available from my gist.github page. I created a variable (here named scriptUrl) in the variables section that contains the RAW URL of my script, and a reference to that variable is used in the fileUris.

The extension will download the script and then execute it locally. Change the commandToExecute to call our script with an unrestricted execution policy.

You have a timed window of ~30 minutes to execute your script. If it takes longer than that, your deployment will fail.

{
    "apiVersion": "2015-06-15",
    "type": "extensions",
    "name": "config-app",
    "location": "[resourceGroup().location]",
    "dependsOn": [
        "[concat('Microsoft.Compute/virtualMachines/', 'FrankDevBox')]"
    ],
    "tags": {
        "displayName": "config-app"
    },
    "properties": {
        "publisher": "Microsoft.Compute",
        "type": "CustomScriptExtension",
        "typeHandlerVersion": "1.9",
        "autoUpgradeMinorVersion": true,
        "settings": {
            "fileUris": [
                "[variables('scriptUrl')]"
            ]
        },
        "protectedSettings": {
            "commandToExecute": "[concat('powershell -ExecutionPolicy Unrestricted -File ', './SimpleDevBox.ps1')]"
        }
    }
}

The ARM Template

It's finally time to deploy our VM.

# First, we need a Resource Group
    az group create --name frankDemo --location eastus

    # ALWAYS, always validate first... you will save a lot of time
    az group deployment validate --resource-group frankDemo --template-file /home/frank/Dev/DevBox/FrankDevBox.json

    #Finally deploy. This script should take between 5 to 10 minutes
    az group deployment create --name FrankDevBoxDemo --resource-group frankDemo --template-file /home/frank/Dev/DevBox/FrankDevBox.json --verbose

What's Next?!

We created one template; you could make it better.

Deploy from anywhere

By moving the computerName, adminUsername, adminPassword, and the script URL into the parameters section, you could then put the template in a public place like GitHub, and then use the one-click deploy! A sketch of that parameters section follows.
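Here is a minimal sketch of what it could look like (the names are illustrative; scriptUrl is the variable we created earlier, promoted to a parameter):

    "parameters": {
        "computerName": { "type": "string" },
        "adminUsername": { "type": "string" },
        "adminPassword": { "type": "securestring" },
        "scriptUrl": { "type": "string" }
    }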

Directly from the GitHub page, or from anywhere, you just need to build a URL from those two parts: https://portal.azure.com/#create/Microsoft.Template/uri/ and the HTML-encoded URL of your template.

If my template is available at a public raw URL (like on GitHub), then the full URL becomes those two parts concatenated.

Clicking that URL will bring you to the Azure portal (portal.azure.com), to a customized form to deploy your template.


It cannot be easier! You can see mine on GitHub.

Auto shutdown

It's very easy to forget to turn off those VMs. And whether you are paying for them or using the limited MSDN credit, it's a really good practice to shut them down. Why not do it automatically?
That can be done very simply by adding a new resource to the template.

{
    "name": "[concat('autoshutdown-', 'FrankDevBox')]",
    "type": "Microsoft.DevTestLab/schedules",
    "apiVersion": "2017-04-26-preview",
    "location": "[resourceGroup().location]",
    "properties": {
        "status": "Enabled",
        "taskType": "ComputeVmShutdownTask",
        "dailyRecurrence": {
            "time": "19:00"
        },
        "timeZoneId": "UTC",
        "targetResourceId": "[resourceId('Microsoft.Compute/virtualMachines', 'FrankDevBox')]",
        "notificationSettings": {
            "status": "Enabled",
            "emailRecipient": "",
            "notificationLocale": "en",
            "timeInMinutes": "30"
        }
    },
    "dependsOn": [
        "[concat('Microsoft.Compute/virtualMachines/', 'FrankDevBox')]"
    ]
}

In Video Please!

If you prefer, I also have a video version of that post.

How to Create an Azure VM with Chocolatey



How to copy files between Azure subscriptions from Windows, Linux, OS X, or the cloud

(in French: here)

Copy, Download or Upload from-to any combination of Windows, Linux, OS X, or the cloud

Data is and will always be our primary concern. Whether shaped as text files, images, VM VHDs, or any other way, at some point in time our data will need to be moved. I already wrote about it previously, and the content of that post is still valuable today, but I wanted to share new options and cover all grounds (meaning Linux, Windows, and OS X).


Here are a few scenarios where you would want to move data.
  • Your Microsoft Azure trial is ending, and you wish to keep all the data.
  • You are creating a new web application, and all those images need to be moved to the Azure subscription.
  • You have a Virtual Machine that you would like to move to the cloud or to a different subscription.
  • ...


AzCopy is a fantastic command-line tool for copying data to and from Microsoft Azure Blob, File, and Table storage. Before, AzCopy was only available for Windows users. However, recently a second version, built with the .NET Core Framework, became available; that one will be introduced later in this post for Mac and Linux users. The commands are very similar but not exactly the same.

AzCopy on Windows

In its simplest expression, an AzCopy command looks like this:
AzCopy /Source:<source> /Dest:<destination> [Options]
If you have installed an Azure SDK on your machine before, you already have it. By default, AzCopy is installed to %ProgramFiles(x86)%\Microsoft SDKs\Azure\AzCopy (64-bit Windows) or %ProgramFiles%\Microsoft SDKs\Azure\AzCopy (32-bit Windows).

If you only need AzCopy, for a server for example, you can download the latest version of AzCopy.
Let's see some frequent usages. First, let's say you need to move all those images from your server to an Azure Blob storage.
AzCopy /Source:C:\MyWebApp\images /Dest: /DestKey:4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== /S

Then, copying those images to another subscription is very easy.
AzCopy /Source: /Dest: /SourceKey:4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== /DestKey:EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== /S

AzCopy Parameters

These examples were simple, but AzCopy is a very powerful tool. I invite you to type one of the following commands to discover more about using AzCopy:
  • For detailed command-line help for AzCopy: AzCopy /?
  • For command-line examples: AzCopy /?:Samples

AzCopy on Linux

Before you can install AzCopy, you will need to install .NET Core. This is done very simply with a few commands.
curl | gpg --dearmor > microsoft.gpg
sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg
sudo sh -c 'echo "deb [arch=amd64] xenial main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-get update
sudo apt-get install dotnet-sdk-2.0.2
Then to install it, you just need to get it with a wget command, unzip it, and execute the install script.
wget -O azcopy.tar.gz 
tar -xf azcopy.tar.gz 
sudo ./
In its simplest expression, the .NET Core version of an AzCopy command looks like this:
azcopy --source <source> --destination <destination> [Options]
It is very similar to the original version, but parameters use -- and - instead of /, and where a : was required, it's now a simple space.

Uploading to Azure

Here is an example that copies a single file, GlobalDevopsBootcamp.jpg, to an Azure Blob storage. We pass the full local path of the file into --source, the destination is the full URI, and finally the destination blob storage key. Of course, you could also use a SAS token if you prefer.
azcopy \
--source /home/frank/demo/GlobalDevopsBootcamp.jpg \
--destination \
--dest-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== 

Copying Between Azure Subscriptions

To copy the image to a second Azure subscription, we use the same command; the source is now an Azure Storage URI, and we pass both the source and the destination keys:
azcopy \
--source \
--destination \
--source-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== \
--dest-key EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== 

Azure CLI

Azure CLI is a set of cross-platform commands for the Azure platform. It gives you tools to manipulate all Azure components, but this post will focus on the Azure Storage features.

There are two versions of the Azure Command-Line Interface (CLI) currently available:

  • Azure CLI 2.0: written in Python, compatible only with the Resource Manager deployment model.
  • Azure CLI 1.0: written in Node.js, compatible with both the classic and Resource Manager deployment models.

Azure CLI 1.0 is deprecated and should only be used for support with the Azure Service Management (ASM) model with "classic" resources.

Installing Azure CLI

Let's start by installing Azure CLI. Of course, you can download an installer, but since everything is evolving very fast, why not get it from the Node Package Manager (npm)? The installation will be the same; you just need to specify the version if you absolutely need Azure CLI 1.0.

sudo npm install azure-cli -g


To keep the same scenario, let's try to copy all images to a blob storage. Unfortunately, Azure CLI doesn't offer the same flexibility as AzCopy, and you must upload the files one by one. However, to upload all images from a folder, we can easily put the command in a loop.

for f in Documents/images/*.jpg
do
   azure storage blob upload -a frankysnotes -k YoMjXMDe+694FGgOaN0oaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQWvGrWnWj5m4X1U0HIPUIAA==  $f blogimages
done


In the previous command, -a was the account name, and -k was the access key. These two pieces of information can easily be found in the Azure portal. From the portal (portal.azure.com), select the storage account, then in the right band click on Access keys.


To copy a file (ex: a VM disk, aka VHD) from one storage to another, in a different subscription or region, it's really easy. This time we will use the command azure storage blob copy start, and the -a and -k are related to our destination.

azure storage blob copy start '' imagesbackup -k EwXpZ2uZ3zrjEbpBGDfsefWkj3GnuFdPCt2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== -a frankshare

The nice thing about this command is that it's asynchronous. To see the status of your copy, just execute the command azure storage blob copy show:

azure storage blob copy show -a frankshare -k YoMjXMDe+694FGgOaN0oPaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQVxsqhWvGrWnWj5m4X1U0HIPUIAA== imagesbackup 20151011_151451.MOV



Azure CLI 2.0 (Windows, Linux, OS X, Docker, Cloud Shell)

The Azure CLI 2.0 is Azure's new command-line tool, optimized for managing and administering Azure resources that work with the Azure Resource Manager. Like the previous version, it will work perfectly on Windows, Linux, OS X, and Docker, but also from the Cloud Shell!

Cloud Shell is available right from the Azure portal, without any plugin.

Uploading to Azure

The commands are the same as in the previous version, except that now the tool is named az. Here is an example that uploads a single file into an Azure Blob storage.

az storage blob upload --file /home/frank/demo/CloudIsStrong.jpg \
--account-name frankysnotes \
--container-name blogimages --name CloudIsStrong.jpg \
--account-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA==

Copying Between Subscriptions

Let's now copy the file to another Azure subscription. A thing to be aware of is that --account-name and --account-key are for the destination, even if it's not explicitly specified.
az storage blob copy start \
--source-account-name frankysnotes  --source-account-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== \
--source-container blogimages --source-blob CloudIsStrong.jpg   \
--account-name frankshare  --account-key EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== \
--destination-container imagesbackup  \
--destination-blob CloudIsStrong.jpg 
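This copy runs asynchronously here too. A minimal sketch to check on its progress with the 2.0 CLI, assuming the copy status is surfaced under properties.copy of az storage blob show:

az storage blob show --container-name imagesbackup --name CloudIsStrong.jpg \
--account-name frankshare \
--account-key EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== \
--query properties.copy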

In Video Please!

If you prefer, I also have a video version of that post.

One More Thing

Sometimes we don't need to script things, and a graphical interface is much better. For those situations, the must-have is the Azure Storage Explorer. It does a lot: upload, download, and manage blobs, files, queues, tables, and Cosmos DB entities. And it works on Windows, macOS, and Linux!

It's just the beginning

This post was just an introduction to two very powerful tools. I strongly suggest reading the official documentation to learn more. Use the comments to share all your questions and suggestions.


How to save huge money by shutting down your VM automatically

Updated on 2018-03-14

Virtual machines (VM) are used in most solutions nowadays: as a [ProcessName] server, as a temporary machine to run tests or demos, and sometimes even as a development machine. One of the great benefits of the cloud is that you only pay for what you use. So unlike the old server that you keep paying for, you won't pay for the virtual machine's CPU while it's turned off! In this post, I explain how to do it with your existing machines, and also what to do with all the future ones you will be creating.

(This post is also available in French.)

Already have a VM up and running? Here's what to do

From the Azure portal (portal.azure.com), select the virtual machine (VM) that you wish to edit. Then look in the option panel on the left for Auto-shutdown in the Operations section. You should have something that looks like this:


At any time you can enable and disable that functionality, it won’t affect the running VM.

Now, to activate it, click on Enabled. Then select the time you would like the VM to shut down at. Be sure to select the right time zone; by default it's UTC. You can either adjust the time for UTC or change the time zone; both options are valid.

Now you could decide to enable the notification. That could be useful if you may want to postpone the shutdown for one or two hours, or integrate the shutdown with another process like a backup or a cleaning job...

To activate the notification option, just click on Enabled and enter an email address. If you want to attach the shutdown to a Logic App or an Azure Function, use the webhook. Here is an example of the notification email; note the Postpone options link.


What if you have many VMs running

Let's say you already have twenty (or more) VMs running. You would love to simply execute a PowerShell script like this:

$myMVsName = @("franDev1", "frankBuildserver", "demo_sales2018")

For ($i=0; $i -lt $myMVsName.Length; $i++) {
    Set-AzureRmDtlAutoShutdownPolicy $myMVsName[$i]
}

Update - 2018-03-14

Well, today this is only possible for VMs that are part of a DevTest Labs, not for "regular" VMs. However, I'm sure that day will come pretty quickly. Does that mean you need to go into all your VMs and set it manually? No. You can use Azure Automation to stop a list of VMs on a regular schedule. A big advantage of this solution is that you can be more creative, since it offers a lot more flexibility: you could identify the VMs to shut down based on some tags, you could have a different schedule for the week versus the weekend, and you could even have a task to start VMs in the morning... More to come on that topic in a future post... If you want to read about how to get started with Azure Automation, click here.

Multiple VMs that already exist? No problem

Obviously, if you have multiple virtual machines that already exist, it is not very efficient to change their configuration one by one via the portal. Here is a small script to change the configuration of a large number of VMs in one shot.

    # Login-AzureRmAccount

    $Subscription = Get-AzureRmSubscription -SubscriptionName 'YOUR_SUBSCRIPTION_NAME'
    Select-AzureRmSubscription -Subscription $Subscription.Id

    $selectedVMs = Get-AzureRmVM -ResourceGroupName cloud5mins
    foreach($vm in $selectedVMs)
    {
        $ResourceGroup = $vm.ResourceGroupName
        $vmName = $vm.Name
        $ScheduledShutdownResourceId = "/subscriptions/$($Subscription.Id)/resourceGroups/$ResourceGroup/providers/microsoft.devtestlab/schedules/shutdown-computevm-$vmName"
        $Properties = @{}
        $Properties.Add('status', 'Enabled')
        $Properties.Add('targetResourceId', $vm.Id)
        $Properties.Add('taskType', 'ComputeVmShutdownTask')
        $Properties.Add('dailyRecurrence', @{'time'= 2100})
        $Properties.Add('timeZoneId', 'Eastern Standard Time')
        $Properties.Add('notificationSettings', @{status='Disabled'; timeInMinutes=60})

        New-AzureRmResource -Location $vm.Location -ResourceId $ScheduledShutdownResourceId -Properties $Properties -Force
    }

The variable $selectedVMs contains all the VMs that we wish to edit. In this sample, I only get the VMs contained in the resource group cloud5mins, but there are no limits to what you can do. You could select all VMs with a specific OS, tag, location, name, etc.

The variable $ScheduledShutdownResourceId will be the identifier of the auto-shutdown configuration we wish to inject. Note that the provider is microsoft.devtestlab.

Next, we create a collection of properties in $Properties. status is the one that activates or deactivates the auto-shutdown. targetResourceId is the resource ID of the VM we target.

The only things left are to specify the time and the time zone.

If you prefer, I also have a video version that explains all the steps.

How to shutdown automatically all your existing VMs

End Update

Let's create a VM with the auto-shutdown pre-configured with ARM

Of course, a much more efficient way to set the auto-shutdown is at creation time, by adding a new resource of type Microsoft.DevTestLab/schedules to your template. This option was previously only accessible for DevTest Labs, but recently it was made available to any VM.
Here is an example of the variables that could be added to your template.

"variables": {
    "ShutdowTime": "21:00",
    "TimeZone": "UTC",
    "emailRecipient": "",
    "notificationLocale": "en",
    "timeInMinutes": 30
}

And here is an example of the Microsoft.DevTestLab/schedules resource. One of these should be added for every VM you wish to auto-shutdown; since this template is for one server, only one instance is required.

{
    "name": "[concat('autoshutdown-', variables('vmName'))]",
    "type": "Microsoft.DevTestLab/schedules",
    "apiVersion": "2017-04-26-preview",
    "location": "[resourceGroup().location]",
    "properties": {
        "status": "Enabled",
        "taskType": "ComputeVmShutdownTask",
        "dailyRecurrence": {
            "time": "[variables('ShutdowTime')]"
        },
        "timeZoneId": "[variables('TimeZone')]",
        "targetResourceId": "[resourceId('Microsoft.Compute/virtualMachines', variables('vmName'))]",
        "notificationSettings": {
            "status": "Enabled",
            "emailRecipient": "[variables('emailRecipient')]",
            "notificationLocale": "[variables('notificationLocale')]",
            "timeInMinutes": "[variables('timeInMinutes')]"
        }
    },
    "dependsOn": [
        "[concat('Microsoft.Compute/virtualMachines/', variables('vmName'))]"
    ]
}
How to Automatically Generate Video Sub-Title in Another Language

I recently started a French YouTube channel. Quickly, I got a message asking me to add English subtitles, and I also got a suggestion to leverage Azure Logic Apps and some Cognitive Services to help me with that task. I really liked the idea, so I gave it a shot. I recorded myself, and in twenty minutes I was done. Even though it was not the success I was hoping for, the application works perfectly. It's just that speaking French with a lot of English technical words was a little bit too hard for the Video Indexer. However, if you speak only one language in your video, that solution works perfectly. In this post, I will show you how to create that Logic App with Azure Video Indexer and Cognitive Services.

The Idea

Once a video is dropped into a OneDrive folder (or any file system accessible from Azure), a Logic App gets triggered and uploads the file to the Azure Video Indexer, generates a Video Text Tracks (VTT) file, and saves this new file in another folder. A second Logic App then gets started and uses the Translator Text API from Azure Cognitive Services to translate the VTT file, and saves it into the final folder.


The Generation

Before getting started, you will need to create your Video Indexer API subscription. To do this, log in to the Video Indexer developer portal, and subscribe to the Video Indexer APIs - Production in the Products tab. You should then get your API keys.


To get more details on the subscription, refer to the documentation. To know the names, parameters, and code samples for all the methods available in your new API, click on the APIs tab.


Now let's create our first Logic App. I always prefer to start with a blank template, but take whatever fits you. Any online file system's trigger will do; in this case I'm using When a file is created from OneDrive. I had some issues with the trigger: it was not always fired by a new file. I tried the When a file is modified trigger, but it didn't solve the problem. If you think you know what I was doing wrong, feel free to leave a comment. :)

The first real action is to upload the file to the Azure Video Indexer. We can do that very easily by using the method Upload video and index, passing the name and content from the trigger.

Of course, the longer the video is, the longer the processing will be, so we will need to wait. A way to do that is by adding a waiting loop: use the method Get processing state from the Video Indexer and loop until the status is Processed. To slow down the loop, just add a wait action and set it to three or five minutes.

When the file is completely processed, it will be time to retrieve the VTT file. This is done in two simple steps. First, we get the URL by calling the method Get the transcript URL, then with a simple HTTP GET we download the file. The last thing we need to do is save it in a folder where our second Logic App will be watching for new drops.

In the visual designer, the Logic App should look like this.


The Translation

The second Logic App is very short. Once again, it gets triggered by a new file trigger in our OneDrive folder. Then it's time to call our Translator Text API from Azure Cognitive Services. Thanks to the great Logic App interface, it's very intuitive to fill in all the parameters for our call. Once we get the translation, we need to save it into our final destination.

The Logic App should look like this.



It was much easier than I expected. I really like implementing those integration projects with Logic Apps. It's so easy to "plug" all those APIs together with this interface. And yes, like I mentioned in the introduction, the result was not "great". I ran tests with videos purely in English (even with my accent) or only in French (no mix), and the result was really good. So I think the problem is really the fact that I mix French and English. I could improve the Indexer by spending time providing files, so the service could better understand my "Franglish". However, for twenty minutes of work, I'm really impressed by the way it turned out. If you have ideas on how to improve this solution, or if you have some questions, feel free to leave a comment. You can also watch my French YouTube video.

All the code is available online on Github - Cloud5VideoHelper.


How to Fix the [ERROR] Get-ChildItem when Deploying an Azure Resource Group in Visual Studio

Lately, I've been having some trouble when deploying from Visual Studio. At first, I didn't care, since I didn't have time to investigate and because most of the time I use PowerShell or Azure CLI. However, this issue was unusual for Visual Studio, so I decided to see what the problem was and try to fix it.

The Problem

In a solution, I added a simple Azure Resource Group deployment project just like this one.


Then when I try to right-click and do a Deploy...


I was having this error message:

 - The following parameter values will be used for this operation:
 - Build started.
 - Project "TestARMProject.deployproj" (StageArtifacts target(s)):
 - Project "TestARMProject.deployproj" (ContentFilesProjectOutputGroup target(s)):
 - Done building project "TestARMProject.deployproj".
 - Done building project "TestARMProject.deployproj".
 - Build succeeded.
 - Launching PowerShell script with the following command:
 - 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\Deploy-AzureResourceGroup.ps1' -StorageAccountName '' -ResourceGroupName 'TestARMProject' -ResourceGroupLocation 'eastus' -TemplateFile 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.json' -TemplateParametersFile 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.parameters.json' -ArtifactStagingDirectory '.' -DSCSourceFolder '.\DSC'
 - Account          : Frank Boucher
 - SubscriptionName : My Subscription
 - SubscriptionId   : xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
 - TenantId         : xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
 - Environment      : AzureCloud
 - VERBOSE: Performing the operation "Replacing resource group ..." on target "".
 - VERBOSE: 7:06:33 - Created resource group 'TestARMProject' in location 'eastus'
 - ResourceGroupName : TestARMProject
 - Location          : eastus
 - ProvisioningState : Succeeded
 - Tags              : 
 - TagsTable         : 
 - ResourceId        : /subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/TestARMProject
 - Get-ChildItem : Cannot find path 
 - 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.json' because it does not 
 - exist.
 - At D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\Deploy-AzureResourceGroup.ps1:108 
 - char:48
 - + ... RmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseNa ...
 - +                                       ~~~~~~~~~~~~~~~~~~~~~~~~~~~
 -     + CategoryInfo          : ObjectNotFound: (D:\Dev\local\Te...zuredeploy.json:String) [Get-ChildItem], ItemNotFound 
 -    Exception
 -     + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.GetChildItemCommand
 - Deploying template using PowerShell script failed.
 - Tell us about your experience at
Apparently, the script is failing in Get-ChildItem because my template file is missing?! I looked in the folder D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject, and indeed the files are missing! Fortunately, fixing this is really simple.

The Solution

The problem is very simple: when Visual Studio builds the project, it doesn't copy the template and script files into the build folder (in this case bin\Debug\staging\). In fact, Visual Studio is doing exactly as we are telling it. Let's see the build action for those files. Right-click azuredeploy.json and select Properties (or Alt+Enter).


See how the Build Action is set to None? Change that to Content (for all the scripts and templates). Save and deploy again.



My new Youtube Channel

I've been blogging with you for about ten years now on this blog, and it's been a pleasure. More recently I started, quietly, a French blog named Cloud en Français; feel free to have a look.
Today, I'm super excited to share with you my new project: a YouTube channel with original content published every week... in French! Every Wednesday I will publish a five-minute video to answer a problem or a question.

Come see me online, subscribe, ask questions...

Cloud en 5 minutes



How to access an SQL Database from an Azure Function and Node.js

The other day, a friend asked me how he could add some functionality to an existing application without having access to the code. It the perfect case to demo some Azure Functions capability, so I jumped on the occasion. Because my friend is a Node.js developer on Linux, and I knew it was supported, I decided to try that combination. I know Node, but I'm definitely not and expert since I don't practice very often.

This post is my journey building that demo. I was out of my comfort zone, coding in Node and working on a Linux machine, but not that far... Because these days, you can "do some Azure" from anywhere.

The Goal

Code an Azure Function that connects to an SQL Database (it could be any data source), using Node.js and tools available on Ubuntu.

Note: In this post, I will be using Visual Studio Code, but you could also create your function directly in the Azure portal or from Visual Studio.

Getting Started

If you are a regular reader of this blog, you know how much I like Visual Studio Code. It's a great tool, available on Mac, Linux, and Windows, and it gives you the opportunity to enjoy all its features from anywhere, feeling like you are in your cozy and familiar environment. If VSCode is not already installed on your machine, go grab your free copy at code.visualstudio.com.

Many extensions are available for VSCode, and one gives us the capability to code and deploy Azure Functions. To install it, open VSCode, select the extension icon, and search for Azure Functions; it's the one with the yellow lightning and the blue angle brackets.


Create the Azure Function

To get started, let's create an Azure Functions project. Be sure to be in the folder where you wish to create your Function App. Open the Command Palette (Ctrl + Shift + P) and type Azure Functions. Select Azure Functions: Create New Project. That will add some configuration files for the Function App.

Now let's create a Function. You could open the Command Palette again and search for Azure Functions: Create Function, but let's use the UI this time. At the bottom left of the Explorer section, you should see a new section called AZURE FUNCTIONS. Click on the little lightning bolt to create a new Function.


After you specify the Function App name, the Azure subscription, and other little essentials, a new folder will be added to your folder structure, and the function is created. The code of our function is in the file index.js. At the moment of writing this post, only JavaScript is supported by the VSCode extension.
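The new folder also contains a function.json describing the trigger binding. Here is a minimal sketch for a timer-triggered function like this one; the five-minute schedule is an assumption for illustration:

    {
        "bindings": [
            {
                "name": "myTimer",
                "type": "timerTrigger",
                "direction": "in",
                "schedule": "0 */5 * * * *"
            }
        ],
        "disabled": false
    }

The name value is what surfaces as the myTimer parameter in index.js.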

Open the file index.js and replace all its content with the following code.

var Connection = require('tedious').Connection;
var Request = require('tedious').Request;
var TYPES = require('tedious').TYPES;

module.exports = function (context, myTimer) {

    var _currentData = {};

    var config = {
        userName: 'frankadmin',
        password: 'MyPassw0rd!',
        server: '', // the server name was stripped from the original post; ex: yourserver.database.windows.net
        options: { encrypt: true, database: 'clouden5db' }
    };

    var connection = new Connection(config);

    // Once connected, fetch the data.
    connection.on('connect', function (err) {
        if (err) {
            context.log(err);
            context.done(err);
            return;
        }
        getPerformance();
    });

    function getPerformance() {
        var request = new Request("SELECT 'Best' = MIN(FivekmTime), 'Average' = AVG(FivekmTime) FROM RunnerPerformance;", function (err) {
            if (err) {
                context.log(err);
            }
        });

        request.on('row', function (columns) {
            _currentData.Best = columns[0].value;
            _currentData.Average = columns[1].value;
        });

        // When the SELECT completes, save the statistics.
        request.on('requestCompleted', function () {
            saveStatistic();
        });

        connection.execSql(request);
    }

    function saveStatistic() {
        var request = new Request("UPDATE Statistic SET BestTime=@best, AverageTime=@average;", function (err) {
            if (err) {
                context.log(err);
            }
        });
        request.addParameter('best', TYPES.Int, _currentData.Best);
        request.addParameter('average', TYPES.Int, _currentData.Average);

        request.on('row', function (columns) {
            columns.forEach(function (column) {
                if (column.value === null) {
                    context.log('NULL');
                } else {
                    context.log("Statistic Updated.");
                }
            });
        });

        request.on('requestCompleted', function () {
            context.done();
        });

        connection.execSql(request);
    }
};

This code is just there to demonstrate how to connect to an SQL Database; it does not represent best practices. At the top, we have some declarations that use the package tedious; I will get back to that later. After that, I create a connection using the configuration declared just before, then hook some functions to some events. On the connection's connect event, the function getPerformance() is called to fetch the data.

On the request's row event, we grab the data and do the "math"; then finally, on requestCompleted, we call the second sub-function, which updates the database with the new values. To get more information and see more examples about tedious, check the GitHub repository.

Publish to Azure

All the code is ready; it's now time to publish our function to Azure. One more time, you could do that through the Command Palette or the extension menu. Use the method of your choice and select Deploy to Function App. After only a few seconds, our function will be deployed to Azure.

Navigate to the Azure portal and get to your Function App. If you try to Run the function right now, you will get an error because tedious is not recognized.

Install the dependencies

We need to install the dependencies for the Function App, in this case tedious. A very simple way is to create a package.json file and use the Kudu console to install it. Create a package.json file with the following JSON in it:

    "name": "CloudEn5Minutes",
    "version": "1.0.0",
    "description": "Connect to Database",
    "repository": {
       "type": "git",
       "url": "git+"
    "author": "",
    "license": "ISC",
    "dependencies": {
        "tedious": "^2.1.1"

Open the Kudu interface. You can reach it by clicking on the Function App, then the Platform features tab, and finally Advanced tools (Kudu). Kudu is also available directly at the URL https://[FunctionAppName].scm.azurewebsites.net. Select the Debug console (CMD). Then, in the top section, navigate to the folder home\site\wwwroot and drag & drop the package.json file. Once the file is uploaded, type the command npm install to download and install all the dependencies declared in our file. Once it's all done, you should restart the Function App.
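Once the file is in place, the whole dependency install is only these two commands in the Kudu console (the D:\home path is how Kudu exposes the file system on Windows plans; treat the exact drive as an assumption):

    cd D:\home\site\wwwroot
    npm install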


Wrapping up & my thoughts

There it is! If you go back to your function now and try to execute it, it will work perfectly. It's true that I'm familiar with Azure Functions and SQL Database; however, for a first experience with Ubuntu and Node.js in the mix, I was expecting more resistance. One more time, VSCode was really useful, and everything was done with ease.

For those of you who would like to test this exact function, here is the SQL code to generate what will be required on the database side.

    CREATE TABLE RunnerPerformance(
        Id           INT IDENTITY(1,1)  PRIMARY KEY,
        FivekmTime   INT
    );

    CREATE TABLE Statistic(
        Id          INT IDENTITY(1,1)  PRIMARY KEY,
        BestTime    INT,
        AverageTime INT
    );

    INSERT Statistic (BestTime, AverageTime) VALUES (1, 1);

    DECLARE @cnt INT = 0;

    WHILE @cnt < 10
    BEGIN
        -- The original seed values were lost; random 5 km times (in minutes) as a placeholder.
        INSERT INTO RunnerPerformance (FivekmTime)
            VALUES (CAST(RAND() * 30 + 15 AS INT));
        SET @cnt = @cnt + 1;
    END;

Video version


Lessons Learned with Power BI and Dynamic DAX Expressions

Since its availability, I have tried to use Power BI as often as I can. It is so convenient to build visuals to explain data coming from a ton of possible data sources, and it's a breeze to share them with clients or colleagues. However, this post is not an infomercial about Power BI; it's about sharing some challenges I hit while preparing a report and how I fixed them.

The Data Source

The context is simple: all transactions are in one table, and I have a second table with a little information related to clients. To that, I personally like to add a calendar table, because it simplifies my life.
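If your model doesn't already have a calendar table, a minimal one can be built as a calculated table. A hedged sketch using CALENDARAUTO, which infers the date range from the dates in the model:

Calendar = CALENDARAUTO()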


For this report, it is very important to put a slicer by Client.

The Goal

I needed one report that shows, for every customer, three different Year To Date (YTD) totals: the classic YTD, a YTD where the beginning of the year is, in fact, when the client started their enrolment, and a rolling twelve months.
It looks pretty simple, and in fact, it's not that complicated. Let's examine each total's formula one by one.

Classic Year To Date Total

Before we get started, it's a good practice to reuse measures to simplify our formulas and make our expressions explicit. Let's create a measure for the Total Sales that will be used inside our other formulas.

TotalSales = SUM('Sales'[Total])

Now the Year To Date is simple to add: create a New Measure and enter the formula:

YTDClassic = TOTALYTD([TotalSales], 'Calendar'[Date])

If you activate the preview features of Power BI, it could be even easier. Look for the New Quick Measure button, select Total Year To Date, fill out the form, and voila!


The generated formula looks a bit different because Power BI handles the error in the expression.

TotalYTD = 
IF(
    ISFILTERED('Calendar'[Date]),
    ERROR("Time intelligence quick measures can only be grouped or filtered by the Power BI-provided date hierarchy."),
    TOTALYTD([TotalSales], 'Calendar'[Date].[Date])
)

Anniversary Year To Date Total

I spent more time than I was expecting on that one, because the online DAX documentation says that the formula TOTALYTD accepts a third parameter to specify the end of the year. So if I had only one client, with a fixed anniversary date (or a fiscal year), this formula would work, assuming the special date is April 30th:
TOTALYTD([TotalSales], 'Calendar'[Date], "04-30")
However, in my case, the ending date changes with every client. I'm sure right now you are thinking: that's easy, Frank, just set a variable and that's it! Well, it won't work. The thing is, the formula expects a static literal; no variable is allowed, even if it returns a string.
The workaround looks a bit hard at first, but it's not that complex. We need to write our own YTD formula. Let's look at the code, and I will explain it after.

Anniversary YTD = 
VAR enddingDate =   LASTDATE(Company[EnrolmentDate])
VAR enddingMonth =  MONTH ( enddingDate )
VAR enddingDay =    DAY ( enddingDate )
VAR currentDate =   MAX ( Calendar[Date] )
VAR currentYear =   YEAR ( currentDate )
VAR enddingThisYear =   DATE ( currentYear, enddingMonth, enddingDay )
VAR enddingLastYear =   DATE ( currentYear - 1, enddingMonth, enddingDay )
VAR enddingSelected =   IF ( enddingThisYear < currentDate, enddingThisYear, enddingLastYear )
RETURN
    CALCULATE (
        [TotalSales] ,
        DATESBETWEEN ( Calendar[Date], enddingSelected, currentDate )
    )

The first lines are all variable declarations. They are not required, but I find it easier to understand when things are very explicit. Since I'm slicing my report by company, using LASTDATE is just a way to avoid errors; there should be only one record. Then we extract the year, month, and day.
The last variable, enddingSelected, identifies whether the anniversary (the end date) has already passed in the current calendar year.
The CALCULATE function returns the TotalSales between the last anniversary date and today.

Rolling twelve Total

For the last formula, the rolling twelve, we will reuse the previous code, but in a simpler way, since the end date is always today.

Rolling 12 Total = 
VAR todayDate = TODAY()
VAR todayMonth =    MONTH ( todayDate )
VAR todayDay =  DAY ( todayDate )
VAR todayYear = YEAR ( todayDate )
VAR enddingLastYear =   DATE ( todayYear - 1, todayMonth, todayDay + 1 )
RETURN
    CALCULATE (
        [TotalSales] ,
        DATESBETWEEN( Calendar[Date], enddingLastYear, todayDate )
    )

Wrap it up

I definitely learned a few things during that Power BI session, but it turned out to be pretty easy. Again, leave a comment or send me an email if you have any comments or questions; I will be very happy to hear from you.