Showing posts with label devops.

Reading Notes #380

Suggestion of the week

Cloud

Programming

Miscellaneous


~

Reading Notes #374

Cloud


Programming


Podcast

  • Hevesh5 - Making a YouTube Career from Viral Domino Art (#46) (That Creative Life) - Great show. An amazing story.
  • Azure Functions using Node with Simona Cotin (.NET Rocks!) - Great show. I just switched my website to that JAMstack pattern. I was planning to use Azure Functions to add a few little twists... I'm happy to see that I'm not alone in thinking like that!
  • 0230 - Alain Vezina - Le métier du DevOps (Visual Studio Talk Show) - Great episode, very interesting to hear about the DevOps role from someone who lives it every day. Thanks for the suggestion; I think I'm due to reread The Phoenix Project.
  • Goal Setting Tips & Tracking KPIs (Video Pursuit Podcast) - Really interesting episode. Everybody talks about metrics and KPIs... but it's not often you hear about the "how". I really like how the goals are explained: achievable, but not easy... and how we should react when we don't reach them.

Miscellaneous


~ Good week!

Reading Notes #372

Suggestion of the week

Cloud

Programming


Miscellaneous

Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline

Static websites are lightning fast, and running them inside an Azure Blob Storage instead of a WebApp is incredibly economical (less than $1/month). Does that mean you need to do everything manually? Absolutely not! In a previous post I explained how to automatically generate your static website using a Build Pipeline inside Azure DevOps. In this post, let's complete the CI-CD by creating a Release Pipeline to deploy it.

The Azure Resource Manager (ARM) Template


First things first. If we want our release pipeline to deploy our website in Azure, we first need to be sure our resources are available "up there." The best way to do this is by using an Azure Resource Manager (ARM) template. I will use the same project started in the previous post; feel free to adapt it to your structure or copy it from there.

Create a new file named deploy.json in the deployment folder. We need a simple storage account.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "StorageName": {
            "type":"string",
            "defaultValue": "cloudenfrancaisv2",
            "maxLength": 24
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "[parameters('StorageName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "[parameters('StorageName')]"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        }
    ],
    "outputs": {}
}

I used a parameter (StorageName) to define the name of the storage account. This way I could have multiple pipelines deploying into different storage accounts.
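If you want to validate the template outside of the pipeline first, a quick local test with the Azure CLI could look like the sketch below. The resource group name frankdemo-rg is only an example, and az group deployment create was the command name at the time of writing (newer CLI versions call it az deployment group create).

# create (or reuse) a resource group for the static website
az group create --name frankdemo-rg --location eastus

# deploy the ARM template, overriding the StorageName parameter
az group deployment create \
    --resource-group frankdemo-rg \
    --template-file deployment/deploy.json \
    --parameters StorageName=cloudenfrancaisv2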

Now, to make the ARM template accessible to the release pipeline, we also need to publish it. The easiest way to do that is to add another CopyFiles task in our azure-pipeline.yml. Add this task just before the PublishBuildArtifacts.

- task: CopyFiles@2
  displayName: 'Copy deployment content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/deployment'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/deployment
    cleanTargetFolder: true

Once you commit and push these changes, it will trigger a build. When done, the ARM template will be available, and we will be able to start working on the release pipeline.

The Release Pipeline


Navigate to the DevOps project created in the previous post. This time, create a new Release Pipeline. When asked, select an empty template; we will manually pick the tasks we need.

First, we need to define the trigger and where our artifacts are. Click on the artifact box on the left of the screen, select the build pipeline as the source, and let's use the latest version of the artifact for our deployment.

To get continuous deployment, you need to enable it by clicking on the lightning bolt and toggling the trigger to Enabled.

Now let's select our tasks. Click on the "+" sign to add new tasks. We need three of these: Azure Resource Group Deployment, Azure CLI, and Azure File Copy.



Task 1 - Azure Resource Group Deployment


The first one will be an Azure Resource Group Deployment. It will be used to deploy our ARM template and make sure that the resources are available in Azure.

To configure the ARM deployment, we need to select the Azure subscription and authorize the pipeline to access it. Then you will need to specify the name of the resource group you will be deploying into, its location, and finally point to the linked ARM template.


Task 2 - Azure CLI


The second one is an Azure CLI task. As I am writing this post, it's not possible to enable the static website property of a storage account from an ARM template. Therefore, we will execute an Azure CLI command to change that configuration. Once you have picked the Azure subscription, select Inline script and enter this Azure CLI command:

az storage blob service-properties update --account-name wyamfrankdemo --static-website  --index-document index.html

This will enable the static website property of the storage account named wyamfrankdemo, and set the default document to index.html.

Task 3 - Azure File Copy


The last task is an Azure File Copy to copy all our files from $(System.DefaultWorkingDirectory)/drop/drop/outpout to the $web container (in our Azure Blob storage). The container must be named $web; that's the name Azure uses for the static website.
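If you ever need to push the files without the pipeline, or simply prefer a CLI step, the same copy can be sketched with az storage blob upload-batch. The local ./output folder is an assumption based on the generator's output; adjust it to your artifact path and make sure you are logged in with an account that has access to the storage account.

# upload the generated site into the $web container of the storage account
az storage blob upload-batch \
    --account-name wyamfrankdemo \
    --source ./output \
    --destination '$web'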

Wrapping up


Once you are done configuring the Release Pipeline, it's time to save it and run it. After only a minute or two (this demo is pretty small), the blog should be available in Azure. To find your endpoint (aka URL), you can go to portal.azure.com and look at the static website property of the blob storage that we just created.
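If you prefer the command line to the portal, the endpoint can also be retrieved with the Azure CLI; this is a small sketch using the storage account from the earlier command:

# print the public URL of the static website
az storage account show \
    --name wyamfrankdemo \
    --query "primaryEndpoints.web" \
    --output tsv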

In a video, please!


I also have a video of this post if you prefer.





How to use Azure DevOps Pipeline and Cake to generate a static website

I have that little website, a blog that doesn't consume much bandwidth, and I was looking to optimize it. Since Azure Blob storage is such an inexpensive resource, I thought it would be the perfect fit. I could use a static website generator to transform my markdown files into a nice-looking blog and publish that in Azure! Using an Azure DevOps pipeline, I could do all of that automatically at every git push, without having anything installed on my machine... meaning I could write a new blog post from anywhere and still be able to update my blog.

In this post, I will explain all the steps required to create a continuous integration and continuous deployment process to deploy a static website into Azure.

The Goal


The idea here is to have a folder tracked by git on a local machine. At every push, we want that change to trigger our CI-CD process. The Build Pipeline will generate the static website. The Release Pipeline will create our Azure resources and publish those artifacts.

The Static Website


In this post, I'm using Wyam.io as the static website generator. However, it doesn't matter; there are a ton of excellent generators available: Jekyll, Hugo, Hexo, etc. I selected Wyam because it is written in .Net and, if eventually I want to dig deeper, it would be easier for me.

All those generators follow the same pattern. You have an input folder where you have all your posts and images, and an output folder that contains the generated result. You don't need to track the content of your output folder, so it's a good practice to modify the .gitignore file accordingly. As an example, here is how mine looks.

output/

tool/
tools/

wwwroot/

config.wyam.dll
config.wyam.hash
config.wyam.packages.xml

Build Pipeline


The build pipeline will generate our website for us. Therefore, it needs to have the generator installed. A great tool to do this kind of task is Cake. What I like about that tool is that it is cross-platform, so I can use it without worrying about which OS it will run on.

The Azure pipeline is defined in an azure-pipeline.yml file. Installing Cake should definitely be one of our first steps. To know how to do that, navigate to the Get started page of the Cake website; it explains that we need to execute a build.ps1 or build.sh (depending on your build setup). That will install Cake and execute the file build.cake. Those files can be found on the GitHub repository mentioned on the website.

On the Wyam website, in the deployment section of the documentation, you will find a sample for our required build.cake file. It looks like this:

#tool nuget:?package=Wyam&version=2.2.0
#addin nuget:?package=Cake.Wyam&version=2.1.3

var target = Argument("target", "Build");

Task("Build")
    .Does(() =>
    {
        Wyam( new WyamSettings {
            Recipe = "Blog",
            Theme = "CleanBlog"
        });        
    });

Task("Preview")
    .Does(() =>
    {
        Wyam(new WyamSettings
        {
            Recipe = "Blog",
            Theme = "CleanBlog",
            Preview = true,
            Watch = true
        });        
    });

RunTarget(target);

On the first two lines, it installs the required NuGet packages (you should definitely specify the versions). Then it defines a few tasks and runs the generation command. Create that file at the root of the website folder.
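If you want to try the generation locally before wiring up the pipeline, you can simply run the bootstrapper. The exact argument syntax for picking a target depends on the bootstrapper version you downloaded, so treat the second command as a sketch:

# run the default "Build" target; the generated site ends up in ./output
./build.sh

# the bootstrapper forwards extra arguments to Cake, so selecting the
# "Preview" target (local web server + watch) looks something like this
./build.sh --target=Preview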

Now let's have a look at the azure-pipeline.yml file.

trigger:
- master

variables:
  DOTNET_SDK_VERSION: '2.1.401'

pool:
  vmImage: 'VS2017-Win2016'

steps:
- task: DotNetCoreInstaller@0
  displayName: 'Use .NET Core SDK $(DOTNET_SDK_VERSION)'
  inputs:
    version: '$(DOTNET_SDK_VERSION)'

- powershell: ./build.ps1
  displayName: 'Execute Cake PowerShell Bootstrapper'

- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/outpout
    cleanTargetFolder: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'

The first line specifies the pipeline trigger. In our case, we will look at the master branch. Then I declare a variable to keep the .Net Core version; that way, it will be easier to maintain the script in the future.

The pool command specifies what kind of server is created. Here I'm using a Windows one, yet I could have used Linux too (all the components are cross-platform).

Then comes the list of steps. The first one installs .Net Core. The second step is a PowerShell command to execute our build.ps1 file. At this stage, the static website should be generated in a subfolder named output. The last two steps copy the content of the output folder into the ArtifactStagingDirectory and then publish it. This way the Release Pipeline can access the artifacts.

There is detailed information about all the commands for a YAML Azure Pipeline file in the documentation. Create your own or copy-paste this one into a new azure-pipeline.yml file under a subfolder named deployment. Once your file is created, commit and push it to GitHub or any repository.

Navigate to Azure DevOps (dev.azure.com). Open your project, or create a new one. Now, from the left menu, click on Pipelines (the rocket icon) to create a new one. If you are using an external repository, like me, you will need to authorize Azure DevOps to access your repo.

To configure the pipeline, since we already have created the azure-pipeline.yml file, select the Existing Azure Pipeline YAML file option and point it to our file in the deployment folder.



It will open our YAML file. If you wish, you can update it. Run it by clicking the blue Run button in the top-right corner. Your build pipeline is done. Now, every time you push changes to your repository, the build will get triggered and generate the static website.

(Next post in the series - Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline)


In a video, please!

I also have a video of this post if you prefer.






References

Reading Notes #370

Cloud


Programming


Databases


Miscellaneous



Reading Notes #369

Cloud


Programming


Miscellaneous


~

Reading Notes #368


Cloud


Programming


Miscellaneous


Books

Tribes: We Need You to Lead Us 

Author: Seth Godin 

A book that will polarize you. In my case, I really enjoyed it until the last page. I felt motivated, and it was nice.

Reading Notes #367

Suggestion of the week


Cloud


Programming


Miscellaneous


Reading Notes #366

Cloud


Programming


Databases


Miscellaneous



Reading Notes #364

Cloud




    Programming



    Miscellaneous




    ~

      Reading Notes #363


      Cloud




      Programming




      Databases




      Miscellaneous


      ~


      Reading Notes #358

      Cloud


      Programming


      Miscellaneous


      ~

      Reading Notes #354

      Cloud


      Programming


      Books

      Extreme Ownership: How U.S. Navy SEALs Lead and Win (Jocko Willink, Leif Babin) - Very interesting book. Yes, it contains a lot of battle details, and at first I was not sure, but then things all "fall" into place when you understand what the story was "demonstrating." It also contains more business-focused examples. Everything is very clear and well explained in plain English.









      ~

      Reading Notes #350




      Cloud


      Programming


      Miscellaneous


      Books

      Fast Focus (Damon Zahariades) - Great short book. Unlike others of its kind, this book goes right to the point and offers actionable items. It's very practical and accessible to everyone. At the end of the book, you know what to do to get started and improve your focus.



      Let’s create a continuous integration and continuous deployment (CI-CD) with Azure DevOps

      I'm about to start a new project and want it to have continuous integration (CI) and continuous deployment (CD). I've been using VSTS for a while now but didn't have the chance to try the new pipelines. In case you didn't know, VSTS has been rebranded/redefined as Azure DevOps. Before going in with the real thing, I decided to give it a try with a simple project. This post is to relay those first steps.

      Get Started


      Let's start by creating our Azure DevOps project. Navigate to dev.azure.com and, if you don't already have an account, create one; it's free! Once you are logged in, create a new project by clicking the blue New project button in the top right corner.

      createNewProject

      You will need to provide a unique name and a few simple pieces of information.

      The Application


      First things first, we need an application. For this post, I will be using a simple ASP.NET Core site. For the repository, we have options: Azure DevOps supports many repositories: GitHub, Bitbucket, private Git, and its own. Because the project I've created is public, I decided to keep the code in the same place as everything else.

      From the left menu, select Repos. From here, if the code already exists, just add the new repository as a remote, or clone the empty one to your local machine; the usual. Create and add your code to that repository (a quick sketch of the git commands follows below).

      Repos
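      For reference, wiring an existing local folder to the new repository is only a couple of git commands. The URL below is a placeholder; use the one shown by the Clone button of your own project.

      # if the code already exists locally, point it to the new repository
      git remote add origin https://dev.azure.com/yourorg/yourproject/_git/yourproject
      git push -u origin master

      # or clone the empty repository and then add your code to it
      git clone https://dev.azure.com/yourorg/yourproject/_git/yourproject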

      The Azure WebApp


      The next step is to create a placeholder for our CD pipeline. We will create an empty shell of a web application in Azure with these three Azure CLI commands. You can execute them locally or from the Cloud Shell. (Don't forget to validate that you are in the right subscription.)
      az group create --name simplegroup --location eastus
      
      az appservice plan create --name simpleplan --resource-group simplegroup --sku FREE
      
      az webapp create --name simplefrankweb --resource-group simplegroup --plan simpleplan
      
      The first command creates a resource group. Then, inside this group, we create a service plan, and finally we add a web app to the mix.
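      To quickly confirm that the empty shell is up, and to grab its URL, a small check with the Azure CLI could look like this (defaultHostName is the property holding the site's hostname):

      # show the hostname of the newly created web app
      az webapp show --name simplefrankweb --resource-group simplegroup --query defaultHostName --output tsv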

      Continuous Integration


      The goal is to have the code compile at every commit. From the left menu bar, select Pipelines and click the create new button. The first step is to identify where our code is; as you can see, Azure DevOps is flexible and accepts code from outside.

      NewPipeline_step1

      Select the exact repository.

      NewPipeline_step2

      This third step displays the YAML code that defines your pipeline. At this point, the file is not complete, but it's enough to build; we will come back to it later. Click the Add button to add the azure-pipelines.yml file at the root level of your repository.

      NewPipeline_step3

      The build pipeline is ready; click the Run button to execute it for the first time. Now, at every commit, the build will be triggered. To see the status of your build, just go into the Builds section from the left menu bar.

      buildSuccess

      Continuous Deployment


      Great, our code compiles at every commit. It would be nice if the code could also be automatically deployed into our dev environment. To achieve that, we need to create a Release Pipeline, and our pipeline will need artifacts. We will edit the azure-pipelines.yml to add two new tasks. You can do this directly in the online repository or from your local machine; remember, the file is at the root. Add these commands:

      - task: DotNetCoreCLI@2
        displayName: 'dotnet publish $(buildConfiguration)'
        inputs:
          command: publish
          publishWebProjects: True
          arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'
          zipAfterPublish: True
      
      - task: PublishBuildArtifacts@1
        displayName: 'publish artifacts'
      

      Those two tasks publish our application (package it) and make it available in our Artifacts folder. To learn more about the types of commands available and to see examples, have a look at the excellent documentation at: https://docs.microsoft.com/azure/devops/pipelines/languages/dotnet-core. Once you are done, save and commit (and push if it was local).

      From the left menu bar, click on Pipelines, select Releases, and click the blue button to create a new release pipeline. Select the template that matches your application. For this post, Azure App Service deployment is the one we need.

      NewRelease_step1
      The next thing will be to rename the environment to something other than Stage 1. I named mine "to Azure", but it could be dev, prod, or anything that makes sense for you. Click on the Add an artifact button.

      ReleasePipeline

      You will now specify to the pipeline where to pick up the artifacts it will deploy. In this case, we want the "output" of our latest build. I also renamed the Source alias to Drop.

      AddArtifact

      To get our continuous deployment (CD), we need to enable that trigger by clicking on the little lightning bolt and enabling it.

      TriggerRelease

      The last step to configure the Release Pipeline is to specify a destination. Clicking on "1 job, 1 task" in the middle of the screen (with the little red exclamation point in a circle) will open the window where we will do that.

      Select the subscription you would like to use, and then click on the Authorize button on the right. Once it's done, go change the App Service name. Click on it, wait 2-3 seconds, and you should see the app we created with our Azure CLI displayed. Select it, and voila!

      SetupReleaseDetails
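      If the app doesn't show up, or you just want to double-check its exact name, a quick query with the Azure CLI works too; this sketch reuses the resource group created earlier:

      # list the web apps available in the resource group
      az webapp list --resource-group simplegroup --query "[].name" --output tsv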

      Now add a ReadMe.md file by checking out the code on your local machine or directly in Azure DevOps. Grab a badge from the build and/or release and copy-paste it into the ReadMe. To get the code snippet of your badge, go to your build/release definition and click the ellipsis (...) button. Select Status badge and copy the snippet that matches your destination file (in our case, the Markdown one).

      GetaBadge_2

      Now, when you go to the Overview page, you will have a nice badge that keeps you informed of the status. It also works on any web page; just use the HTML snippet instead.

      simple_frank

      In a video, please!


      I also have a video of this post if you prefer.




      References:



      Reading Notes #346

      Cloud

      IMG_20181006_093540 (1)

      Programming


      Miscellaneous

      • Microsoft Ignite Aftermath (Chris Pietschmann, Dan Patrick) - If you are like me and need to catch up on what happened at Ignite, this post is a really good place to start as it contains a list of all the links we need.

      ~

      Reading Notes #345

      Suggestion of the week



      Cloud



      Programming



      Miscellaneous

      ~

      How to Create Your Custom Artifacts for DevTest Labs

      Most of the time when we use an Azure DevTest Lab, it's to test our own applications. This means we will need to install them on the virtual machines, every time. To do that, we need to create a custom artifact and add it to our formulas or to our claimable VMs. Lucky for us, creating a custom artifact is much easier than you may think. In fact, in this post I will show you how easy it can be.

      Goal

      I want to create an artifact available from a private repository (Git from dev.azure.com in this case) that will set the timezone inside the VM.

      Getting started

      First, let's use a section of the Azure portal that is very useful: the Get Started section. In the portal, navigate to your DevTest Lab (1), and select the Getting Started option from the left menu bar (2). In this new bar, scroll down to the Learn more area and select Sample artifacts and scripts (3).

      GetStarted
      That will open the DevTestLab artifacts, scripts and samples project from Azure on GitHub. Open the Artifacts folder to see the list of all the usual artifacts you find in the public repo that is available by default in the portal.

      Notice how all artifacts are in their own folder. When you create a new artifact, you can always come here and pick something similar to what you are trying to do. This way, you won't start from scratch. Let's open windows-vsts-download-and-run-script. An artifact is defined in the file Artifactfile.json. This file is mandatory and cannot be renamed. You can put scripts, images, or anything else you need inside this folder.

      Open the Artifactfile.json file and have a look.

      ArtifactfileSample
      As you can see, it's a simple JSON file. In section (A) you define the title, description, publisher, OS, and the icon. Note that the icon must be accessible publicly; it could be on GitHub, a blob storage, or a website. Section (B) is where you define all the parameters you may need to install your artifact on the VM. Finally, section (C) is the command to execute.

      Create the Artifact

      Here is the JSON for our windows-Set-TimeZone artifact. I made it very static by not passing any parameters, but in a real situation, a timezone parameter would be better.
      Artifactfile.json
      {
          "$schema": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/schemas/2016-11-28/dtlArtifacts.json",
          "title": "Set TimeZone to Eastern Standard Time",
          "description": "Execute tzutil command on the VM set set the Time Zone",
          "publisher": "FBoucher",
          "tags": [
              "PowerShell"
          ],
          "iconUri": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/Artifacts/windows-run-powershell/powershell.png",
          "targetOsType": "Windows",
          "parameters": { },
          "runCommand": {
              "commandToExecute": "tzutil.exe /s \"Eastern Standard Time\""
          }
      }

      Create an artifact repository

      For this post, I'm using Git from Azure DevOps (dev.azure.com), previously named VSTS, but any private repository should work. If it's not already done, create a project and go to the Repos section. Create a root folder named Artifacts, or something else if you prefer. Then add a new folder for your artifact. To follow best practices, you should start the name of your artifact with the name of the targeted OS; in my case, windows-Set-TimeZone. Now add the file Artifactfile.json defined previously.
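      As a side note, the repository itself can also be created from the command line with the azure-devops CLI extension. This is only a sketch; it assumes the extension is installed and uses placeholder organization, project, and repository names:

      # one-time setup: point the CLI to your organization and project
      az devops configure --defaults organization=https://dev.azure.com/yourorg project=yourproject

      # create the Git repository that will hold the artifacts
      az repos create --name yourArtifactsRepo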

      Note the URL of the repository; it should be easy to get it by clicking on the Clone button at the top right of the screen.

      clone

      Add repository to DevTest Lab

      Now we need to add this repository to our DevTest Lab. From portal.azure.com, open the blade of your lab. From the left panel, click on Repositories, then click the Add button.

      CreateRepo
      It's time to use the information noted previously. This is about the Repository, not the artifact.

      Use the Artifact

      The only thing left is to use our artifact. You can find it while creating a VM or a formula. When you have parameters defined in your Artifactfile.json, they will be listed in a form at the left.
      artifactList
      And if you try it, you will see that the time matches the desired timezone. Here my PC is set to display in a 24-hour format, but it's the same... yep, I'm in Eastern Standard Time.

      voila

      Add it to an ARM template

      Doing it with the nice interface is good when you are learning. However, we all know that no DevOps engineer will do that manually every time. So let's add our repository to our ARM template. If you need more details on the deployment method, I explain it in a previous post, How to be efficient with our Azure Devtest Lab deployments.

      When you don't know the type or the structure of a resource, you can always go to the Resource Explorer (resources.azure.com); there you will be able to find your resource and see how it's defined.

      ResourceExplorer
      So for this post our artifactsources will look like this:
      {
          "properties": {
              "displayName": "Cloud5mins",
              "uri": "https://fboucher.visualstudio.com/DefaultCollection/Cloud5minsArtifacts/_git/Cloud5minsArtifacts",
              "sourceType": "VsoGit",
              "folderPath": "/Artifacts",
              "armTemplateFolderPath": "",
              "branchRef": "master",
              "securityToken": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
              "status": "Enabled"
          }, 
          "name": "Cloud5minsRepo",
          "type": "Microsoft.DevTestLab/labs/artifactsources"
      }
      An artifactsources entry goes in the resources list inside the DevTest Lab.

      ARM
      In an ARM template, you have the main Resources node (A), then you have the Lab node (B). Inside this node, you should see a second resources list (C), where the Virtual Network is defined. The artifactsources should go there.

      Then when you declare your formula, you just need to reference this repository, exactly like the public one.

      In a video, please!

      I also have a video of this post if you prefer.



      Reference:

      How to be efficient with our Azure Devtest Lab deployments

      (This post is also available in French.)

      DevTest Labs is a fantastic tool to quickly build environments for development and test purposes, or for a classroom. It offers great tools to restrict the users without removing all their freedom. It will speed up onboarding with its claimable VMs that are already created and waiting for the user. Formulas will help ensure that you always get the latest version of your artifacts installed on those VMs. And finally, the auto-shutdown will keep your money where it should stay... in your pocket.


      In this post, I will show you how to deploy an Azure Devtest Lab with an Azure Resource Manager (ARM) template, and create the claimable VMs based on your formulas in one shot.

      Step 1 - The ARM template


      First, we need an ARM template. You can start from scratch of course, but that may be a lot of work if you are just getting started. You can also pick one from GitHub and customize it.

      What I recommend is to create a simple Azure DevTest Lab directly from the Azure portal. Once your lab is created, go to the Automation script option of the resource group and copy/paste the ARM template into your favorite text editor.
      armTemplate
      Now you must clean it up. If you don't already know it, use the 5 Simple Steps to Get a Clean ARM Template method; it's an excellent way to get started.
      Once the template is clean, we need to add a few things that didn't come along during the export. Usually, in an ARM template, you get one list named resources. However, a DevTest Lab also contains its own list named resources, and it's probably missing.
      {
          "parameters": {},
          "variables": {},
          "resources": []
      }
      In the following example, I added the lab's resources list just after the lab's location. This list must contain a virtualnetworks resource. It's also a good idea to add a schedules and a notificationChannels resource. Those two will be used to automatically shut down all the VMs and to send a notification to the user just before.

      {
          "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
          "contentVersion": "1.0.0.0",
          "parameters": {
              ...
          },
          "variables": {
              ...
          },
          "resources": [
              {
                  "type": "Microsoft.DevTestLab/labs",
                  "name": "[variables('LabName')]",
                  "apiVersion": "2016-05-15",
                  "location": "[resourceGroup().location]",
                  "resources": [
                      {
                          "apiVersion": "2017-04-26-preview",
                          "name": "[variables('virtualNetworksName')]",
                          "type": "virtualnetworks",
                          "dependsOn": [
                              "[resourceId('microsoft.devtestlab/labs', variables('LabName'))]"
                          ]
                      },
                      {
                          "apiVersion": "2017-04-26-preview",
                          "name": "LabVmsShutdown",
                          "type": "schedules",
                          "dependsOn": [
                              "[resourceId('Microsoft.DevTestLab/labs', variables('LabName'))]"
                          ],
                          "properties": {
                              "status": "Enabled",
                              "timeZoneId": "Eastern Standard Time",
                              "dailyRecurrence": {
                                  "time": "[variables('ShutdowTime')]"
                              },
                              "taskType": "LabVmsShutdownTask",
                              "notificationSettings": {
                                  "status": "Enabled",
                                  "timeInMinutes": 30
                              }
                          }
                      },
                      {
                          "apiVersion": "2017-04-26-preview",
                          "name": "AutoShutdown",
                          "type": "notificationChannels",
                          "properties": {
                              "description": "This option will send notifications to the specified webhook URL before auto-shutdown of virtual machines occurs.",
                              "events": [
                                  {
                                      "eventName": "Autoshutdown"
                                  }
                              ],
                              "emailRecipient": "[variables('emailRecipient')]"
                          },
                          "dependsOn": [
                              "[resourceId('Microsoft.DevTestLab/labs', variables('LabName'))]"
                          ]
                      }
                  ],
                  "dependsOn": []
              }
              ...

      Step 2 - The Formulas


      Now that the DevTest Lab is well defined, it's time to add our formulas. If you have already created some from the portal, don't look for them in the template; at the moment, the export won't script the formulas.

      A quick way to get the JSON of your formulas is to create them from the portal and then use Azure Resources Explorer to get the code.
      resourceExplorer
      In a web browser, navigate to https://resources.azure.com to open your Resource Explorer. Select the subscription, resource group, and lab that you are working on. In the Formulas node (4) you should see your formulas; click one and let's bring that JSON into our ARM template. Copy-paste it at the resource level (the top one, not the one inside the Lab).
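      If you prefer staying in the terminal, a possible alternative to the Resource Explorer is to dump the formulas' JSON with the Azure CLI; the resource group and lab names below are the ones used later in this post:

      # dump the JSON definition of the formulas created from the portal
      az lab formula list --resource-group cloud5mins --lab-name C5M-DevTestLab --output json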

      Step 2.5 - The Azure KeyVault


      You shouldn't put any password inside your ARM template; however, having them pre-defined inside the formulas is pretty convenient. One solution is to use an Azure KeyVault.

      Let's assume the KeyVault already exists (I will explain how to create it later). In your parameter file, add a parameter named adminPassword and reference the KeyVault. We also need to specify the secret we want to use; in this case, we will put the password in a secret named vmPassword.
          "adminPassword": {
              "reference": {
                  "keyVault": {
                      "id": "/subscriptions/{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}/resourceGroups/cloud5mins/providers/Microsoft.KeyVault/vaults/Cloud5minsVault"
                  },
                  "secretName": "vmPassword"
              }
          }
      Now to get the password in the ARM template just use a regular parameter, and voila!
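      The id in that keyVault reference is simply the resource ID of the vault. If you don't want to build it by hand, a quick way to retrieve it (using this post's example names) is:

      # print the full resource ID of the KeyVault, ready to paste into the parameter file
      az keyvault show --name Cloud5minsVault --resource-group cloud5mins --query id --output tsv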

      Step 3 - The ARM Claimable VMs


      Now we have a lab and the formulas; the only thing missing is the claimable VM based on the formulas. It's impossible to create both the formulas and the VMs in one ARM template. The alternative is to use a script that will create our VMs just after the deployment.
      az group deployment create --name test-1 --resource-group cloud5mins --template-file DevTest.json --parameters DevTest.parameters.json --verbose
      
      az lab vm create --lab-name C5M-DevTestLab -g  cloud5mins --name FrankDevBox --formula SimpleDevBox  
      As you can see in the second Azure CLI command, we are creating a virtual machine named FrankDevBox based on the formula SimpleDevBox. Note that we don't need to specify any credentials because everything was pre-defined in the formula. Pretty neat!

      Here is part of a script that will create a KeyVault (if it doesn't already exist) and populate it. Then it will deploy our ARM template and, finally, create our claimable VM. You can find all the code in my GitHub project: Azure-Devtest-Lab-efficient-deployment-sample.

      [...]
      
      # Checking for a KeyVault
      searchKeyVault=$(az keyvault list -g $resourceGroupName --query "[?name=='$keyvaultName'].name" -o tsv )
      lenResult=${#searchKeyVault}
      
      if [ ! $lenResult -gt 0 ] ;then
          echo "---> Creating keyvault: " $keyvaultName
          az keyvault create --name $keyvaultName --resource-group $resourceGroupName --location $resourceGroupLocation --enabled-for-template-deployment true
      else
          echo "---> The Keyvaul $keyvaultName already exists"
      fi
      
      
      echo "---> Populating KeyVault..."
      az keyvault secret set --vault-name $keyvaultName --name 'vmPassword' --value 'cr@zySheep42!'
      
      
      # Deploy the DevTest Lab
      
      echo "---> Deploying..."
      az group deployment create --name $deploymentName --resource-group $resourceGroupName --template-file $templateFilePath --parameters $parameterFilePath --verbose
      
      # Create the VMs using the formula created in the deployment
      
      labName=$(az resource list -g cloud5mins --resource-type "Microsoft.DevTestLab/labs" --query [*].[name] --output tsv)
      formulaName=$(az lab formula list -g $resourceGroupName  --lab-name $labName --query [*].[name] --output tsv)
      
      echo "---> Creating VM(s)..."
      az lab vm create --lab-name $labName -g  $resourceGroupName --name FrankSDevBox --formula $formulaName 
      echo "---> done <--- code="">

      In a video, please!


      I also have a video of this post if you prefer.



      Conclusion


      Whether it be for development, testing, or training, as soon as you are creating environments in Azure, DevTest Labs is definitely a must. It's a very powerful tool that not enough people know about. Give it a try and let me know what you do with Azure DevTest Labs!


      References:

      • Azure-Devtest-Lab-efficient-deployment-sample: https://github.com/FBoucher/Azure-Devtest-Lab-efficient-deployment-sample
      • An Overview of Azure DevTest Labs: https://www.youtube.com/watch?v=caO7AzOUxhQ
      • Best practices Using Azure Resource Manager (ARM) Templates: https://www.youtube.com/watch?v=myYTGsONrn0&t=7s
      • 5 Simple Steps to Get a Clean ARM Template: http://www.frankysnotes.com/2018/05/5-simple-steps-to-get-clean-arm-template.html



      ~