
Be more Productive by using Inline Code in your Azure Logic App

In a project using Azure Logic Apps that I am working on, I needed to manipulate strings. I could have created an API or an Azure Function, but the code is very simple and doesn't use any external libraries. In this post, I will show you how to use the new Inline Code action to execute your code snippet directly inside your Logic App.

Quick Context


The Logic App will read a file from my OneDrive (it will also work with Dropbox, Box, etc.). Here is an example of the file:

Nice tutorial that explains how to build, using postman, an efficient API.[cloud.azure.postman.tools]

The goal is to extract tags, contained between the square brackets, from the text.

Logic App: Get File Content


From the Azure Portal, create a new Logic App by clicking the big green "+" button in the top left corner and searching for Logic App.

For this demo, I will use the Interval as a trigger because I will execute the Logic App manually.

The first step will be a Get File Content action from the OneDrive connector. Once you have authorized Azure to access your OneDrive folder, select the file you want to read. For me, it's /dev/simpleNote.txt.

Integration Account


To access the workflowContext, an Azure Logic App requires an Integration Account, so the next step is to create one. Save the current Logic App, and click the big "+" button in the top right corner. This time, search for integration. Select Integration Account, and complete the form to create it.


We now need to assign it to our Logic App. From the Logic App blade, select Workflow Settings in the options list. Then select your Integration Account, and don't forget to save!

Logic App: Inline Code


To add the action at the end of your workflow, click the New step button. Search for Inline Code, and select the action Execute JavaScript Code.


Before copy-pasting the code into the new Inline Code action, let's have a quick look at it.

// Get the note content from the "Get file content" action's outputs
var note = "" + workflowContext.actions.Get_file_content.outputs.body;

// Position right after the opening square bracket of the tag section
var posTag = note.lastIndexOf("[") + 1;
var cleanNote = {};

if (posTag > 0) {
    // Split the text into the tags and the message
    cleanNote.tags = note.substring(posTag, note.length - 1);
    cleanNote.msg = note.substring(0, posTag - 1);
}

return cleanNote;

On the first line, we assign the content of the Get_file_content outputs to a variable named note. We access it through the workflowContext, which exposes the trigger and the actions. To find the name of an action, replace the spaces in its display name with underscore characters ("_").


You can also switch to Code View and see the names of all the components in the JSON code.

Logic App: Use Inline Code Result


Of course, you can use the output of your Inline Code with other steps. You just need to use the Result from the dynamic content menu.


If for some reason the dynamic content list doesn't contain your Inline Code, you can always add the expression directly: @body('Cleaning_Note')?['body'].


Your Logic App should now look like this:


Verdict


The Inline Code action is very promising. Right now, it's limited to JavaScript and cannot access variables or be used inside loops. However, for simple code that doesn't require any references, it's easier to maintain and deploy. You can learn more about what exactly is covered or not here.
And it works as this result shows.


Do you prefer watching instead of reading?


I also have a video of this post if you prefer.




Reading Notes #381

Books

Author: Eric Barker

Nice book. There is always a good story to correlate with the current point, and then it can go in a different direction with another story. All the stories are complementary, adding layer after layer to the more complex message being delivered. Easy to read, enjoyable from the beginning to the last word.

How to make your deployments successful every time

You are done with your code and you are ready to deploy it to Azure. You execute your PowerShell or Bash script and... BOOM! An error message says that the name is already taken. In this post, I will show you a simple way to look like a boss and make your deployments work every time.

____ with given name ____ already exists.

The tricks others use


You could try to add a digit at the end of the resource name (ex: demo-app1, demo-app2, demo-app123...), but that's not really professional. You could generate a random string and append it to the name. Yes, that will work... once. If you try to redeploy your resources, that value will change, so it will never be the same.
The solution is to have a unique string that is constant within our environment.

The solution


The solution is to use the uniqueString() function, part of the Azure Resource Manager (ARM) template language. If we look at the documentation, uniqueString creates a deterministic hash string based on the values provided as parameters. Let's see a quick example with an ARM template that deploys a website named demo-app.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "variables": {
        "webAppName": "demo-app"
    },
    "resources": [
        {
            "type": "Microsoft.Web/sites",
            "apiVersion": "2015-08-01",
            "name": "[variables('webAppName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "[concat('hidden-related:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/frankdemo-plan')]": "Resource",
                "displayName": "[variables('webAppName')]"
            },
            "dependsOn": [
                "Microsoft.Web/serverfarms/frankdemo-plan"
            ],
            "properties": {
                "name": "[variables('webAppName')]",
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms/', 'frankdemo-plan')]"
            }
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2016-09-01",
            "name": "frankdemo-plan",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "F1",
                "capacity": 1
            },
            "tags": {
                "displayName": "frankdemo-plan"
            },
            "properties": {
                "name": "frankdemo-plan"
            }
        }
    ],
    "outputs": {}
}

If you try to deploy this template, you will get an error because the name demo-app is already taken... no surprise here.

Let’s create a new variable suffix and we will use the Resource Group Id and Location as values. Then we just need to append this value to our name using the function concat().

    "variables": {
        "suffix": "[uniqueString(resourceGroup().id, resourceGroup().location)]",
        "webAppName": "[concat('demo-app', variables('suffix'))]"
    }

It's that simple! Now, every time you deploy, a unique string is appended to your resource name. That string will always be the same for a given Resource Group and Location pair.

Because some resource types are more restrictive than others, you may need to adapt your new name. Maybe your resource name plus the thirteen-character hash is too long... No problem, you can easily make it shorter and all lowercase by using substring() and toLower().

 "parameters": {},
    "variables": {
        "suffix": "[substring(toLower(uniqueString(resourceGroup().id, resourceGroup().location)),0,5)]",
        "webAppName": "[concat('demo-app', variables('suffix'))]"
    }

Voila! By using an ARM template, you can now deploy and redeploy without any problem, reproducing the exact same solution every time. To learn more about ARM templates, you can jump into the documentation, where you will find samples, step-by-step tutorials, and more.
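If you want to see that in action, here is a minimal sketch with the Azure CLI; the resource group name frankdemo and the template file name are hypothetical, so adapt them to your project.

# Create (or reuse) the resource group, then deploy the template.
# Running these exact commands again reproduces the same resources, same names included.
az group create --name frankdemo --location eastus
az group deployment create --resource-group frankdemo --template-file azuredeploy.json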


If you have a specific question about ARM templates or if you would like to see more tips like this one, don't hesitate to ask in the comments section or reach out on social media!

In a video, please!


I also have a video of this post if you prefer.




Image by StartupStockPhotos from Pixabay

Reading Notes #376


Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline

Static websites are lightning fast, and running one from an Azure Blob Storage instead of a WebApp is incredibly economical (less than $1/month). Does that mean you need to do everything manually? Absolutely not! In a previous post, I explained how to automatically generate your static website using a Build Pipeline inside Azure DevOps. In this post, let's complete the CI-CD by creating a Release Pipeline to deploy it.

The Azure Resource Manager (ARM) Template


First things first. If we want our release pipeline to deploy our website to Azure, we first need to be sure our resources are available "up there." The best way to do this is by using an Azure Resource Manager (ARM) template. I will use the same project started in the previous post; feel free to adapt it to your structure or copy it from there.

Create a new file named deploy.json in the deployment folder. We need a simple storage account.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "StorageName": {
            "type":"string",
            "defaultValue": "cloudenfrancaisv2",
            "maxLength": 24
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "[parameters('StorageName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "[parameters('StorageName')]"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        }
    ],
    "outputs": {}
}

I used a parameter (StorageName) to define the name of the storage account. This way, I could have multiple pipelines deploying into different storage accounts.
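To illustrate, here is a sketch of how a pipeline (or you, locally) could pass a different name to the same template with the Azure CLI; the resource group name is only an example.

# Each pipeline can deploy the same template with its own storage account name
az group deployment create --resource-group frankdemo \
    --template-file deployment/deploy.json \
    --parameters StorageName=cloudenfrancaisv2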

Now, to make the ARM template accessible to the release pipeline, we also need to publish it. The easiest way to do that is to add another CopyFiles task to our azure-pipeline. Add this task just before the PublishBuildArtifacts.

- task: CopyFiles@2
  displayName: 'Copy deployment content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/deployment'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/deployment
    cleanTargetFolder: true

Once you commit and push these changes, a build will be triggered. When it's done, the ARM template will be available, and we will be able to start working on the release pipeline.

The Release Pipeline


Navigate to the DevOps project created in the previous post. This time, create a new Release Pipeline. When asked, select an empty template; we will pick the tasks we need manually.

First, we need to define the trigger and where our artifacts come from. Click on Add an artifact on the left of the screen. Select the build project, and let's use the latest version of the artifact for our deployment.

To get continuous deployment, you need to enable it by clicking on the lightning bolt and selecting the Enabled toggle.

Now let's select our tasks. Click on the "+" sign to add new tasks. We need three of these: Azure Resource Group Deployment, Azure CLI, and Azure File Copy.



Task 1 - Azure Resource Group Deployment


The first one will be an Azure Resource Group Deployment. It will be used to deploy our ARM template and ensure that the resources are available in Azure.

To configure the ARM deployment, we need to select the Azure subscription and authorize the pipeline to access it. Then you will need to specify the name of the resource group you will be deploying into, its location, and finally the path to the linked ARM template.


Task 2 - Azure CLI


The second one is an Azure CLI task. As I am writing this post, it's not possible to enable the static website property of a storage account from an ARM template. Therefore, we will execute an Azure CLI command to change that configuration. Once you have picked the Azure subscription, select Inline script and enter this Azure CLI command:

az storage blob service-properties update --account-name wyamfrankdemo --static-website  --index-document index.html

This will enable the static website property of the storage account named wyamfrankdemo, and set the default document to index.html.
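If you want to double-check that the property is really on, a quick sketch with the Azure CLI (same account name as above) would be:

# Shows the blob service properties, including the staticWebsite section
az storage blob service-properties show --account-name wyamfrankdemo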

Task 3 - Azure File Copy


The last task is an Azure File Copy to copy all our files from $(System.DefaultWorkingDirectory)/drop/drop/output to the $web container (in our Azure Blob storage). The container must be named $web; that's the name used by Azure for the static website.
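As a side note, if you ever need to reproduce that copy outside the pipeline, a rough Azure CLI equivalent would be the following sketch; the local path is an assumption to adapt.

# Upload every generated file into the special $web container
az storage blob upload-batch --account-name wyamfrankdemo \
    --source ./output --destination '$web'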

Wrapping up


Once you are done configuring the Release Pipeline, it's time to save and run it. After only a minute or two (this demo is pretty small), the blog should be available in Azure. To find your endpoint (aka URL), you can go to portal.azure.com and look at the static website property of the blob storage we just created.
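If you prefer the command line to the portal, the endpoint can also be queried directly; this sketch assumes the same storage account name as before.

# Returns the static website URL of the storage account
az storage account show --name wyamfrankdemo --query "primaryEndpoints.web" --output tsv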

In a video, please!


I also have a video of this post if you prefer.





How to use Azure DevOps Pipeline and Cake to generate a static website

I have this little website, a blog that doesn't consume much bandwidth, and I was looking to optimize it. Since Azure Blob Storage is such an inexpensive resource, I thought it would be the perfect fit. I could use a static website generator to transform my markdown files into a nice-looking blog and publish that to Azure! Using an Azure DevOps pipeline, I could do all of that automatically at every "git push"... meaning I could write a new blog post from anywhere and still be able to update my blog.

In this post, I will explain all the steps required to create a continuous integration and continuous deployment process to deploy a static website into Azure.

The Goal


The idea here is to have a folder on a local machine tracked by git. At every push, we want that change to trigger our CI-CD process. The Build Pipeline will generate the static website. The Release Pipeline will create our Azure resources and publish the artifacts.

The Static Website


In this post, I'm using Wyam.io as the static website generator. However, it doesn't really matter; there are tons of excellent generators available: Jekyll, Hugo, Hexo, etc. I selected Wyam because it is written in .NET, and if I eventually want to dig deeper, it will be easier for me.

All these generators follow the same pattern: you have an input folder containing all your posts and images, and an output folder that contains the generated result. You don't need to track the content of your output folder, so it's good practice to modify the .gitignore file accordingly. As an example, here is how mine looks:

output/

tool/
tools/

wwwroot/

config.wyam.dll
config.wyam.hash
config.wyam.packages.xml

Build Pipeline


The build pipeline will generate our website for us, so it needs to have the generator installed. A great tool for this kind of task is Cake. What I like about this tool is that it's cross-platform, so I can use it without worrying about which OS it will run on.

The Azure pipeline is defined in an azure-pipeline.yml file. Installing Cake should definitely be among our first steps. To learn how to do that, navigate to the Get started page of Cake's website; it explains that we need to execute build.ps1 or build.sh (depending on your build setup). That will install Cake and execute the build.cake file. Those files can be found on the GitHub repository, as mentioned on the website.
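Locally, running the bootstrapper is a one-liner; here is a sketch for the bash side, assuming build.sh and build.cake sit at the root of your repository.

# Installs Cake and runs the default target of build.cake
./build.sh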

On the Wyam website, in the deployment section of the documentation, you will find a sample for our required build.cake file. It looks like this:

#tool nuget:?package=Wyam&version=2.2.0
#addin nuget:?package=Cake.Wyam&version=2.1.3

var target = Argument("target", "Build");

Task("Build")
    .Does(() =>
    {
        Wyam( new WyamSettings {
            Recipe = "Blog",
            Theme = "CleanBlog"
        });        
    });

Task("Preview")
    .Does(() =>
    {
        Wyam(new WyamSettings
        {
            Recipe = "Blog",
            Theme = "CleanBlog",
            Preview = true,
            Watch = true
        });        
    });

RunTarget(target);

On the first two lines, it installs the required NuGet packages (you should definitely pin the versions). Then it defines some tasks and runs the generation command. Create that file at the root of the website folder.

Now let's have a look at the azure-pipeline.yml file.

trigger:
- master

variables:
  DOTNET_SDK_VERSION: '2.1.401'

pool:
  vmImage: 'VS2017-Win2016'

steps:
- task: DotNetCoreInstaller@0
  displayName: 'Use .NET Core SDK $(DOTNET_SDK_VERSION)'
  inputs:
    version: '$(DOTNET_SDK_VERSION)'

- powershell: ./build.ps1
  displayName: 'Execute Cake PowerShell Bootstrapper'

- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/output
    cleanTargetFolder: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'

The first lines specify the pipeline trigger; in our case, we watch the master branch. Then I declare a variable to hold the .NET Core version; that way, it will be easier to maintain the script in the future.

The pool section specifies what kind of agent is created. Here I'm using a Windows one, but I could have used Linux too (all the components are cross-platform).

Then comes the list of steps. The first one installs .NET Core. The second step is a PowerShell command that executes our build.ps1 file; at this stage, the static website is generated in a subfolder named output. The last two steps copy the content of the output folder into the ArtifactStagingDirectory and then publish it, so the Release Pipeline can access the artifacts.

Detailed information about all the commands available in a YAML Azure Pipeline file can be found in the documentation. Create your own, or copy-paste this one into a new azure-pipeline.yml file under a subfolder named deployment. Once your file is created, commit and push it to GitHub or any repository.

Navigate to Azure DevOps (dev.azure.com). Open your project, or create a new one. From the left menu, click on Pipelines (the rocket icon) to create a new pipeline. If you are using an external repository, like me, you will need to authorize Azure DevOps to access your repo.

To configure the pipeline, since we already have created the azure-pipeline.yml file, select the Existing Azure Pipeline YAML file option and point it to our file in the deployment folder.



It will open our YAML file; update it if you wish. Run it by clicking the blue Run button in the top-right corner. Your build pipeline is done. Now, every time you push changes to your repository, the build will be triggered and will generate the static website.

(Next post in the series - Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline)


In a video, please!

I also have a video of this post if you prefer.






Reading Notes #370


How to Unzip Automatically your Files with Azure Function v2

I published a video that explains how to unzip files without any code by using Logic Apps. However, that solution didn't work for bigger files or different archive types. This post explains how you can use an Azure Function to cover those situations. This first iteration supports "Zip" files; all the code is available in my GitHub.

Prerequisites


To create the Azure Function, I will use the excellent Azure Functions extension for Visual Studio Code. You don't "need" it, but it makes things very easy.


You can easily install the extension from inside Visual Studio Code by clicking on the Extensions button in the left menu. You will also need to install the Azure Functions Core Tools.

Creating the Function


Once the extension is installed, you will find a new button in the left menu. It opens a new section with four new options: Create New Project, Create Function, Deploy to Function App, and Refresh.


Click on the first option, Create New Project. Select a local folder and a language; for this demo, I will use C#. This will create a few files and folders. Now let's create our function. From the extension menu, select the second option, Create Function. Create a Blob Trigger named UnzipThis into the folder we just created, and select (or create) the Resource Group, Storage Account, and location in your subscription. After a few seconds, another question will pop up asking for the name of the container that our blob trigger monitors. For this demo, input-files is used.
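If you prefer the terminal over the extension, the same scaffolding can be sketched with the Azure Functions Core Tools; the project name below simply mirrors this demo.

# Create a new C# Functions project, then add a function to it
func init AzUnzipEverything --worker-runtime dotnet
cd AzUnzipEverything
func new
# func new prompts for a template (pick the Blob trigger) and a name (UnzipThis)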

Once the function is created you will see this warning message.


What that means is that, to be able to debug locally, we need to set the AzureWebJobsStorage setting to UseDevelopmentStorage=true in the local.settings.json file. It will look like this:

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "unziptools_STORAGE": "DefaultEndpointsProtocol=https;AccountName=unziptools;AccountKey=XXXXXXXXX;EndpointSuffix=core.windows.net"
    }
}

Open the file UnzipThis.cs; this is our function. In its signature, you can see that the Blob trigger is defined.

[BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]Stream myBlob

The binding is attached to the container named input-files, in the storage account reachable through the connection "cloud5mins_storage". The real connection string is in the local.settings.json file.

Now, let's put the code we need for our demo:

[FunctionName("Unzipthis")]
public static async Task Run([BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]CloudBlockBlob myBlob, string name, ILogger log)
{
    log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name}");

    string destinationStorage = Environment.GetEnvironmentVariable("destinationStorage");
    string destinationContainer = Environment.GetEnvironmentVariable("destinationContainer");

    try{
        if(name.Split('.').Last().ToLower() == "zip"){

            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(destinationStorage);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference(destinationContainer);
            
            using(MemoryStream blobMemStream = new MemoryStream()){

                await myBlob.DownloadToStreamAsync(blobMemStream);

                using(ZipArchive archive = new ZipArchive(blobMemStream))
                {
                    foreach (ZipArchiveEntry entry in archive.Entries)
                    {
                        log.LogInformation($"Now processing {entry.FullName}");

                        // Replace every character that is not a digit, a letter, or a "-" with a "-";
                        // Azure Storage is strict about valid blob names
                        string validName = Regex.Replace(entry.Name, @"[^a-zA-Z0-9\-]", "-").ToLower();

                        CloudBlockBlob blockBlob = container.GetBlockBlobReference(validName);
                        using (var fileStream = entry.Open())
                        {
                            await blockBlob.UploadFromStreamAsync(fileStream);
                        }
                    }
                }
            }
        }
    }
    catch(Exception ex){
        log.LogInformation($"Error! Something went wrong: {ex.Message}");

    }            
}

UPDATED: Thanks to Stefano Tedeschi who found a bug and suggested a fix.

The source of our compressed file is defined in the trigger. To define the destination, destinationStorage and destinationContainer are used; their values are saved in local.settings.json. Then, because this function only supports .zip files, a little validation was required.

Next, we create an archive instance using the System.IO.Compression library, and we create references to the destination storage account, blob, and container. It's not possible to use a second binding here, because one archive file produces a variable number of extracted files, and bindings are static; therefore, we need to use the regular storage API.

Then, for every file (aka entry) in the archive, the code uploads it to the destination storage.

Deploying


To deploy the function, from the Azure Function extension click on the third option: Deploy to Function App. Select your subscription and Function App name.

Now we need to configure our settings in Azure, because by default the local settings are NOT used. Once again, the extension is handy.


Under the subscription, expand the freshly deployed Function App AzUnzipEverything, and right-click on Application Settings. Use Add New Setting to create cloud5mins_storage, destinationStorage, and destinationContainer.

The function is now deployed and the settings are set; now we only need to create the blob storage containers, and we will be able to test the function. You can easily do that directly from the Azure portal (portal.azure.com).

You are now ready to upload a file into the input-files container.
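If you prefer to stay in the terminal for that last part, here is a sketch of the same two steps with the Azure CLI; the connection string is abbreviated and the zip file name is just an example.

# Create the monitored container, then drop an archive into it
az storage container create --name input-files --connection-string "DefaultEndpointsProtocol=https;AccountName=unziptools;..."
az storage blob upload --container-name input-files --name files.zip --file ./files.zip --connection-string "DefaultEndpointsProtocol=https;AccountName=unziptools;..."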

Let's Code Together


This first iteration only supports "Zip" files. All the code is available on GitHub. Feel free to use it. If you would like to see or add support for other archive types, join me on GitHub!

In a video, please!


I also have a video of this post if you prefer.



I also have an extended version where I introduce the Visual Studio Code extension for Azure Functions in more detail, and explain more about Azure Functions v2.

Reading Notes #356

Suggestion of the week

  • Security Headers (Tanya Janca) - Interesting post that shows the code/configuration we need to add, in order to get a more secure website.

Books

Fast Focus: A Quick-Start Guide To Mastering Your Attention, Ignoring Distractions, And Getting More Done In Less Time! (Damon Zahariades) - Great book, well organized. Simple, straight to the point. It is divided into three parts: understanding focus, creating an environment for focus, and employing tactics to focus. It lists the top 10 obstacles to staying focused and gives you a great idea of how to start your journey.

~


Reading Notes #354

Books

Extreme Ownership: How U.S. Navy SEALs Lead and Win (Jocko Willink, Leif Babin) - Very interesting book. Yes, it contains a lot of battle details; at first I was not sure, but then everything fell into place once I understood what the story was demonstrating. It also contains more business-focused examples. Everything is very clear and well explained in plain English.

~

Reading Notes #353

Books

The First 20 Hours: How to Learn Anything...Fast (Josh Kaufman) - That was a very interesting book; I devoured it much faster than I thought! As I was reading it, I kept thinking: hey, as a developer/programmer, I already do a lot of those things... Things change so fast, technology changes... And a few pages later, the author was saying the same thing. :) It looks like we can adapt to many different situations. The author shares a few of his journeys learning new skills. While I may not be interested in learning how to play Go, I found every part of the book very interesting, as those journeys are packed with tons of information.
ISBN: 1591845556 (ISBN13: 9781591845553)


~

Reading Notes #350



Books

Fast Focus (Damon Zahariades) - Great short book. Unlike others of its kind, this book goes right to the point and offers actionable items. It's very practical and accessible to everyone. At the end of the book, you know what to do to get started and improve your focus.



Let’s create a continuous integration and continuous deployment (CI-CD) with Azure DevOps

I'm about to start a new project and want it to have continuous integration (CI) and continuous deployment (CD). I've been using VSTS for a while now, but didn't have the chance to try the new pipelines. In case you didn't know, VSTS has been rebranded/redefined as Azure DevOps. Before going all-in with the real thing, I decided to give it a try with a simple project. This post relays those first steps.

Get Started


Let's start by creating our Azure DevOps project. Navigate to dev.azure.com and, if you don't already have an account, create one; it's free! Once you are logged in, create a new project by clicking the New project blue button in the top right corner.


You will need to provide a unique name and a little basic information.

The Application


First things first, we need an application. For this post, I will be using a simple ASP.NET Core site. For the repository, we have options: Azure DevOps supports many repositories, including GitHub, Bitbucket, private Git repositories, and its own. Because the project I've created is public, I decided to keep the code in the same place as everything else.

From the left menu, select Repos. From here, if the code already exists, just add the remote repository; otherwise, clone the empty repository to your local machine, the usual. Create and add your code to that repository.


The Azure WebApp


The next step is to create a placeholder for our CD pipeline. We will create an empty shell of a web application in Azure with these three Azure CLI commands. You can execute them locally or from the Cloud Shell. (Don't forget to validate that you are in the right subscription.)
az group create --name simplegroup --location eastus

az appservice plan create --name simpleplan --resource-group simplegroup --sku FREE

az webapp create --name simplefrankweb --resource-group simplegroup --plan simpleplan
The first command creates a resource group. Then, inside of this group, we create a service plan, and finally we add a web app to the mix.

Continuous Integration


The goal is to have the code compile at every commit. From the left menu bar, select Pipelines, and click the New pipeline button. The first step is to identify where our code is; as you can see, Azure DevOps is flexible and accepts code from outside.


Select the exact repository.


The third step displays the YAML code that defines your pipeline. At this point, the file is not complete, but it's enough to build; we will come back to it later. Click the Add button to add the azure-pipelines.yml file at the root level of your repository.


The build pipeline is ready; click the Run button to execute it for the first time. Now, at every commit, the build will be triggered. To see the status of your build, go into the Builds section from the left menu bar.


Continuous Deployment


Great, our code compiles at every commit. It would be nice if the code could also be automatically deployed into our dev environment. To achieve that, we need to create a Release Pipeline, and our pipeline will need artifacts. We will edit azure-pipelines.yml to add two new tasks. You can do this directly in the online repository or from your local machine; remember, the file is at the root. Add these tasks:

- task: DotNetCoreCLI@2
  displayName: 'dotnet publish $(buildConfiguration)'
  inputs:
    command: publish
    publishWebProjects: True
    arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'
    zipAfterPublish: True

- task: PublishBuildArtifacts@1
  displayName: 'publish artifacts'

Those two tasks publish our application (package it) and make it available in our Artifact folder. To learn more about the types of commands available and to see examples, have a look at the excellent documentation: https://docs.microsoft.com/azure/devops/pipelines/languages/dotnet-core. Once you are done, save and commit (and push if it was local).

From the left menu bar, click on Pipelines, select Releases, and click the New Release blue button. Select the template that matches your application; for this post, Azure App Service deployment is the one we need.

The next thing will be to rename the environment to something other than Stage 1. I named mine "to Azure", but it could be dev, prod, or anything that makes sense for you. Then click on the Add an artifact button.


You will now tell the pipeline where to pick up the artifacts it will deploy. In this case, we want the "output" of our latest build. I renamed the Source alias to Drop.


To get our continuous deployment (CD), we need to enable that trigger by clicking on the little lightning bolt and enabling it.


The last step to configure the Release Pipeline is to specify a destination. Click on "1 job, 1 task" in the middle of the screen (with the little red exclamation point in a circle); that will open the window where we will do just that.

Select the subscription you would like to use, and then click on the Authorize button on the right. Once that's done, go change the App Service name: click on it, wait 2-3 seconds, and you should see the app we created with our Azure CLI commands appear. Select it, and voila!


Now add a ReadMe.md file, by checking out the code on your local machine or directly in Azure DevOps. Grab a badge from the build and/or release and copy-paste it into the ReadMe. To get the code snippet of your badge, go to your build/release definition and click the ellipsis button. Select Status badge and copy the snippet that matches your destination file (in our case, Markdown).


Now when you go to the Overview page, you will have a nice badge that keeps you informed. It also works on any web page; just use the HTML snippet instead.


In a video, please!


I also have a video of this post if you prefer.







How to be efficient with our Azure Devtest Lab deployments

(This post is also available in French.)

Azure DevTest Labs is a fantastic tool to quickly build environments for development and test purposes, or for a classroom. It offers great tools to restrict users without removing all their freedom. It speeds up onboarding with its claimable VMs, which are already created and waiting for the user. Formulas help ensure that you always get the latest version of your artifacts installed on those VMs. And finally, the auto-shutdown keeps your money where it should stay... in your pocket.


In this post, I will show you how to deploy an Azure DevTest Lab with an Azure Resource Manager (ARM) template, and create the claimable VMs based on your formulas, all in one shot.

Step 1 - The ARM template


First, we need an ARM template. You can start from scratch, of course, but that may be a lot of work if you are just getting started. You can also pick one from GitHub and customize it.

What I recommend is to create a simple Azure DevTest Lab directly from the Azure portal. Once your lab is created, go to the Automation script option of the resource group and copy-paste the ARM template into your favorite text editor.
Now you must clean it up. If you don't already know it, use the 5 Simple Steps to Get a Clean ARM Template method; it's an excellent way to get started.
Once the template is clean, we need to add a few things that didn't survive the export. Usually, in an ARM template, you get one top-level list named resources. However, a DevTest Lab also contains its own nested list named resources, and that one is probably missing.
{
    "parameters": {},
    "variables": {},
    "resources": []
}
In the following example, I added the lab's resources list just after the lab's location. This list must contain a virtualnetworks resource. It's also a good idea to add a schedules and a notificationChannels resource; those two will be used to automatically shut down all the VMs and to notify the user just before.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        ...
    },
    "variables": {
        ...
    },
    "resources": [
        {
            "type": "Microsoft.DevTestLab/labs",
            "name": "[variables('LabName')]",
            "apiVersion": "2016-05-15",
            "location": "[resourceGroup().location]",
            "resources": [
                {
                    "apiVersion": "2017-04-26-preview",
                    "name": "[variables('virtualNetworksName')]",
                    "type": "virtualnetworks",
                    "dependsOn": [
                        "[resourceId('microsoft.devtestlab/labs', variables('LabName'))]"
                    ]
                },
                {
                    "apiVersion": "2017-04-26-preview",
                    "name": "LabVmsShutdown",
                    "type": "schedules",
                    "dependsOn": [
                        "[resourceId('Microsoft.DevTestLab/labs', variables('LabName'))]"
                    ],
                    "properties": {
                        "status": "Enabled",
                        "timeZoneId": "Eastern Standard Time",
                        "dailyRecurrence": {
                            "time": "[variables('ShutdowTime')]"
                        },
                        "taskType": "LabVmsShutdownTask",
                        "notificationSettings": {
                            "status": "Enabled",
                            "timeInMinutes": 30
                        }
                    }
                },
                {
                    "apiVersion": "2017-04-26-preview",
                    "name": "AutoShutdown",
                    "type": "notificationChannels",
                    "properties": {
                        "description": "This option will send notifications to the specified webhook URL before auto-shutdown of virtual machines occurs.",
                        "events": [
                            {
                                "eventName": "Autoshutdown"
                            }
                        ],
                        "emailRecipient": "[variables('emailRecipient')]"
                    },
                    "dependsOn": [
                        "[resourceId('Microsoft.DevTestLab/labs', variables('LabName'))]"
                    ]
                }
            ],
            "dependsOn": []
        }
        ...

Step 2 - The Formulas


Now that the DevTest Lab is well defined, it's time to add our formulas. If you have already created some from the portal, don't look for them in the template; at the moment, the export doesn't script the formulas.

A quick way to get the JSON of your formulas is to create them from the portal and then use the Azure Resource Explorer to get the code.
In a web browser, navigate to https://resources.azure.com to open the Resource Explorer. Select the subscription, resource group, and lab that you are working on. In the Formulas node (4), you should see your formulas; click one, and let's bring that JSON into our ARM template. Copy-paste it at the resources level (the top-level one, not the one nested inside the lab).
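Alternatively, the Azure CLI can return that same JSON; here is a sketch using the lab and formula names from this demo.

# Dumps the formula definition, ready to be adapted into the ARM template
az lab formula show --resource-group cloud5mins --lab-name C5M-DevTestLab --name SimpleDevBox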

Step 2.5 - The Azure KeyVault


You shouldn't put any password inside your ARM template; however, having them pre-defined inside the formulas is pretty convenient. One solution is to use an Azure KeyVault.

Let's assume the KeyVault already exists (I will explain how to create it later). In your parameter file, add a parameter named adminPassword and reference the KeyVault. We also need to specify the secret we want to use; in this case, we will put the password in a secret named vmPassword.
    "adminPassword": {
        "reference": {
            "keyVault": {
                "id": "/subscriptions/{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}/resourceGroups/cloud5mins/providers/Microsoft.KeyVault/vaults/Cloud5minsVault"
            },
            "secretName": "vmPassword"
        }
    }
Now to get the password in the ARM template just use a regular parameter, and voila!

Step 3 - The ARM Claimable VMs


Now that we have a lab and the formulas, the only thing missing is the claimable VMs based on those formulas. It's impossible to create both the formulas and the VMs in one ARM template. The alternative is to use a script that creates our VMs just after the deployment.
az group deployment create --name test-1 --resource-group cloud5mins --template-file DevTest.json --parameters DevTest.parameters.json --verbose

az lab vm create --lab-name C5M-DevTestLab -g  cloud5mins --name FrankDevBox --formula SimpleDevBox  
As you can see in the second Azure CLI command, we are creating a virtual machine named FrankDevBox based on the formula SimpleDevBox. Note that we don't need to specify any credentials, because everything was pre-defined in the formula. Pretty neat!

Here is part of a script that creates a KeyVault (if it doesn't already exist) and populates it. Then it deploys our ARM template and, finally, creates our claimable VM. You can find all the code in my GitHub project: Azure-Devtest-Lab-efficient-deployment-sample.

[...]

# Checking for a KeyVault
searchKeyVault=$(az keyvault list -g $resourceGroupName --query "[?name=='$keyvaultName'].name" -o tsv )
lenResult=${#searchKeyVault}

if [ ! $lenResult -gt 0 ] ;then
    echo "---> Creating keyvault: " $keyvaultName
    az keyvault create --name $keyvaultName --resource-group $resourceGroupName --location $resourceGroupLocation --enabled-for-template-deployment true
else
    echo "---> The Keyvaul $keyvaultName already exists"
fi


echo "---> Populating KeyVault..."
az keyvault secret set --vault-name $keyvaultName --name 'vmPassword' --value 'cr@zySheep42!'


# Deploy the DevTest Lab

echo "---> Deploying..."
az group deployment create --name $deploymentName --resource-group $resourceGroupName --template-file $templateFilePath --parameters $parameterFilePath --verbose

# Create the VMs using the formula created in the deployment

labName=$(az resource list -g cloud5mins --resource-type "Microsoft.DevTestLab/labs" --query [*].[name] --output tsv)
formulaName=$(az lab formula list -g $resourceGroupName  --lab-name $labName --query [*].[name] --output tsv)

echo "---> Creating VM(s)..."
az lab vm create --lab-name $labName -g  $resourceGroupName --name FrankSDevBox --formula $formulaName 
echo "---> done <--- code="">

In a video, please!


I also have a video of this post if you prefer.



Conclusion


Whether it be for development, testing, or training, as soon as you are creating environments in Azure, DevTest Labs is definitely a must. It's a very powerful tool that not enough people know about. Give it a try, and let me know what you do with Azure DevTest Labs!


References:

  • Azure-Devtest-Lab-efficient-deployment-sample: https://github.com/FBoucher/Azure-Devtest-Lab-efficient-deployment-sample
  • An Overview of Azure DevTest Labs: https://www.youtube.com/watch?v=caO7AzOUxhQ
  • Best practices Using Azure Resource Manager (ARM) Templates: https://www.youtube.com/watch?v=myYTGsONrn0&t=7s
  • 5 Simple Steps to Get a Clean ARM Template: http://www.frankysnotes.com/2018/05/5-simple-steps-to-get-clean-arm-template.html



~

Reading Notes #339

~Enjoy!


How to Deploy your Azure Functions Faster and Easily with Zip Push

Azure Functions are great. I used to do a lot of the "csx" version (the C# scripted version), but more recently I switched to the compiled version, and I definitely love it! However, I was looking for a way to keep my deployment short and sweet, because sometimes I don't have time to set up a "big" CI/CD pipeline, or simply because sometimes I'm not the one doing the deployment... In those cases, I need a simple script that deploys everything! In this post, I will share how you can deploy everything with one easy script.

The Context


In this demo, I will deploy a simple C# (full .NET Framework) Azure Function. I will create the Azure Function App and storage using an Azure Resource Manager (ARM) template, and deploy with a method named Zip push, or ZipDeploy. All the code, script, and template are available on my GitHub.

The Azure Functions Code


The Azure Function doesn't have to be special; it can be in any language supported by Azure Functions. Simply to show you everything, here is the code of my function.


using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using GenFu;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

namespace AzFunctionZipDeploy
{
    public static class Function1
    {
        [FunctionName("GetTopRunner")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            // Read the "top" parameter from the query string
            string top = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "top", true) == 0)
                .Value;

            if (top == null)
            {
                // Fall back to the request body
                dynamic data = await req.Content.ReadAsAsync<object>();
                top = data?.top;
            }

            return top == null
                ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a number to get your top x runner on the query string or in the request body")
                : req.CreateResponse(HttpStatusCode.OK, new { message = $"Hello, here is your Top {top} runners", runners = A.ListOf<Person>(int.Parse(top)) });
        }
    }

    class Person
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public int Age { get; set; }
    }
}

It's a really simple function that returns a list of Person objects generated on the fly. The list contains as many people as the number passed as a parameter. I'm using the very useful GenFu library, from my buddies at ASP.NET Monsters.

The only thing we need to do is to create the compressed (zip) file that contains everything our project requires.


In this case, that's the project file (AzFunction-ZipDeploy.csproj), the function's code (Function1.cs), the host configuration (host.json), and the local settings of our function (local.settings.json).
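On macOS or Linux, the archive can be produced from the terminal too; a minimal sketch, assuming the four files are in the current folder:

# Package exactly what the project requires into one zip file
zip AzFunction-ZipDeploy.zip AzFunction-ZipDeploy.csproj Function1.cs host.json local.settings.json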

The ARM template


For this demo, we need one Azure Function App. I will use a template that is part of the Azure Quickstart Templates. A quick look at the azuredeploy.parameters.json file shows that the only parameter we really need to set is the name of our application.


{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "appName": {
        "value": "zipdeploydemo"
        }
    }
}

To be able to ZipDeploy, we need to add one Application Setting to let the Kudu interface know that we need its help to compile our code. To do that, let's open azuredeploy.json and go to the appSettings section. We need to add a new setting named SCM_DO_BUILD_DURING_DEPLOYMENT and set it to true. After adding it, the section should look like this (see the last one... that's our new one):


"appSettings": [
    {
    "name": "AzureWebJobsDashboard",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
    "name": "AzureWebJobsStorage",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
    "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
    "name": "WEBSITE_CONTENTSHARE",
    "value": "[toLower(variables('functionAppName'))]"
    },
    {
    "name": "FUNCTIONS_EXTENSION_VERSION",
    "value": "~1"
    },
    {
    "name": "WEBSITE_NODE_DEFAULT_VERSION",
    "value": "6.5.0"
    },
    {
    "name": "SCM_DO_BUILD_DURING_DEPLOYMENT",
    "value": true
    }
]

The Deployment Script


Now that all the pieces are ready, it's time to put them together in one script. In fact, only the last two commands are required; everything else is just there to make it easier to re-use. Check out my previous post, 5 Simple Steps to Get a Clean ARM Template, to learn more about the best practices related to ARM templates. So let's see that script; it's pretty simple.

    # script to Create an Azure Gramophone-PoC Solution

    resourceGroupName=$1
    resourceGroupLocation=$2

    templateFilePath="./arm/azuredeploy.json"
    parameterFilePath="./arm/azuredeploy.parameters.json"

    dateToken=`date '+%Y%m%d%H%M'`
    deploymentName="FrankDemo"$dateToken

    # az login

    # You can select a specific subscription if you do not want to use the default
    # az account set -s SUBSCRIPTION_ID

    if ! $(az group exists -g $resourceGroupName); then
        echo "---> Creating the Resourcegroup: " $resourceGroupName
        az group create -g $resourceGroupName -l $resourceGroupLocation
    else
        echo "---> Resourcegroup:" $resourceGroupName "already exists."
    fi

    az group deployment create --name $deploymentName --resource-group $resourceGroupName --template-file $templateFilePath --parameters $parameterFilePath --verbose

    echo "---> Deploying Function Code"
    az functionapp deployment source config-zip -g $resourceGroupName -n zipdeploydemo --src "./zip/AzFunction-ZipDeploy.zip"

    echo "---> done <--- code="">

The only "new" thing is the last command functionapp deployment source config-zip. That where we specify to the Azure Function App to look to --src to get our source. Because I'm running it locally, the path is pointing to a local folder. However, you could execute this command also in the CloudShell, and that would become a URI... to an Azure Blob Storage by example.

Deploy and Test


If you haven't noticed yet, I wrote my script in Bash with the Azure CLI. That's because I want my script to be compatible with all platforms. Of course, you could have done it in PowerShell or anything else that calls the REST API.

To deploy, just execute the script, passing the resource group name and its location.

    ./Deploy-AZ-Gramophone.sh cloud5mins eastus


To get the Function URL, go to the Azure portal (portal.azure.com) and click on the Function App that we just deployed. Click on the function (GetTopRunner in this case), then click the </> Get function URL button.


Use that URL in Postman and pass a value for the top parameter to see that the deployment was successful.
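If you don't have Postman at hand, a hypothetical curl call would look like this; replace the key placeholder with your real function key.

    curl "https://zipdeploydemo.azurewebsites.net/api/GetTopRunner?code=YOUR_FUNCTION_KEY&top=5"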


In a video, please!


If you prefer, I also have a video version of this post.



~Enjoy!