
Reading Notes #419


Every Monday, I share my "reading notes". Those are the articles, blog posts, podcast episodes, and books that caught my interest during the week. It's a mix of the news and of what I consumed. Enjoy!


Suggestion of the week

  • Approval Workflows With GitHub Actions (Aaron Powell) - Wow! That's a very clever and impressive way to get approval steps into a GitHub Actions workflow. All the details are in the post if you would like to create your own.

Cloud

Programming

Podcasts

Miscellaneous


Reading Notes #408


Every Monday, I share my reading notes. Those are the articles, blog posts, podcast episodes, and books that caught my interest during the week. It's a mix of the news and of what I consumed.

Cloud

Programming

Miscellaneous


☁️

Reading Notes #405


Every week, I publish my reading notes. Those are the articles, blog posts, podcast episodes, and books that caught my interest. It's a mix of the news and of what I was looking for. This one is the last of 2019!

Cloud


Programming


Miscellaneous

  • Advice to my 20 year old self (Scott Hanselman) - An interesting post. But to be honest, the more I think about it, the less I would want to spoil things. So as good or bad as it sounds, my advice would probably just be something like: trust yourself, you'll be fine.

Deploy to Azure Directly From the Repository with GitHub Actions


You've heard about the new GitHub Actions. Or maybe you haven't, but you would like to add continuous integration and continuous deployment (CI-CD) to your web application. In this post, I will show you how to add a CI-CD pipeline that deploys automatically to Azure using GitHub Actions.

What are GitHub Actions


GitHub Actions are automated workflows that react to events in your repository. One of these could be a CI-CD. With one workflow, you could compile the code and execute some unit tests at every push or pull request (PR); with another, you could deploy the application.
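If you have never seen one, here is a minimal sketch of a workflow (the file name, job name, and echoed message are made up for illustration):

# .github/workflows/hello.yml -- a hypothetical minimal workflow
name: hello
on: [push]   # trigger at every push

jobs:
  say-hello:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@master   # get the code
    - run: echo "Hello from GitHub Actions"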

In this article, I will deploy a .NET Core application to Azure. However, you can use any language you like and deploy anywhere you like... I just needed to pick one :)

Now, let's get started.

Step 1 - The Code


We need some code in a GitHub repo. Create a GitHub repo, clone it locally, and add your app to it. I created mine with dotnet new blazorserver -n cloud5minsdemo -o src. Then commit and push.
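If you want to follow along, the whole sequence looks like this (a sketch; the repository URL is a placeholder for your own):

# clone your (empty) GitHub repository -- the URL is a placeholder
git clone https://github.com/<your-account>/cloud5minsdemo.git
cd cloud5minsdemo

# create the Blazor Server app in the src folder
dotnet new blazorserver -n cloud5minsdemo -o src

# commit and push
git add .
git commit -m "Add Blazor Server app"
git push origin master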

Step 2 - Define the workflow


We've got the code; now it's time to define our workflow. I will provide all the code snippets required for the scenario covered in this post, but there are tons of templates, ready to be used, available directly from your GitHub repository! Let's have a look. From your repository, click on the Actions tab, and voila!


When I wrote this post, a lot of the available templates assumed the Azure resources already existed and that you were adding a CI-CD to the mix to automate your deployment. That's great, but in my case, I was building a brand-new website, so those didn't fit my needs. This is why I created my own template. The workflow I created was inspired by Azure/webapps-deploy, and there is also a lot of information available in Deploy to App Service using GitHub Actions.

Let's add our template to our solution. GitHub looks in the folder .github/workflows/ at the root of the repository. Create a file there with the extension .yml.

Here is the code of my dotnet.yml. As with any YAML file, the secret is in the indentation, since it is whitespace sensitive:

on: [push,pull_request]

env:
  AZURE_WEBAPP_NAME: cloud5minsdemo   # set this to your application's name
  AZURE_GROUP_NAME: cloud5mins2

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:

    # checkout the repo
    - uses: actions/checkout@master
    
    - name: Setup .NET Core
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version:  3.0.101


    # dotnet build and publish
    - name: Build with dotnet
      run: dotnet build ./src --configuration Release
    - name: dotnet publish 
      run: |
        dotnet publish ./src -c Release -o myapp 

    - uses: azure/login@v1
      with:
        creds: ${{ secrets.AZURE_CREDENTIALS }}

    - run: |
        az group create -n ${{ env.AZURE_GROUP_NAME }} -l eastus 
        az group deployment create -n ghaction -g ${{ env.AZURE_GROUP_NAME }} --template-file deployment/azuredeploy.json --parameters myWebAppName=${{ env.AZURE_WEBAPP_NAME }}


    # deploy web app using Azure credentials
    - name: 'Azure webapp deploy'
      uses: azure/webapps-deploy@v1
      with:
        app-name: ${{ env.AZURE_WEBAPP_NAME }}
        package: './myapp' 

    # Azure logout 
    - name: logout
      run: |
        az logout


The Agent


There is a lot in there, so let's start with the first line. The on: defines the trigger; in this case, the workflow will be triggered at every push or PR.

The env: is where you can declare variables. It's totally optional, but I think it helps when templates get more complex, and it makes them easier to reuse.

Then comes the jobs: definition. In this case, we will use the latest version of Ubuntu as our build agent. Of course, in a production environment, you should be more specific and select the OS that matches your needs. This job will have multiple steps, defined in the, you guessed it, steps: section.

We specify the branch to work with and set up our agent with:

uses: actions/setup-dotnet@v1
with:
  dotnet-version: 3.0.101

This is because I have a .NET Core project. For a Node.js project, it would be:

uses: actions/setup-node@v1
with:
  node-version: 10.x

It would be an even better idea to set the version in an environment variable, to be able to change it quickly.
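For example, here is a hedged sketch of that improvement; the variable name DOTNET_VERSION is my choice, not part of the original workflow:

env:
  DOTNET_VERSION: 3.0.101   # hypothetical variable name

steps:
- name: Setup .NET Core
  uses: actions/setup-dotnet@v1
  with:
    dotnet-version: ${{ env.DOTNET_VERSION }}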

The next two instructions are really .NET Core focused, as they build and package the application into a folder named myapp. Of course, in this section you could also execute some unit tests or any other validation that you find useful.

The next section may be less obvious.

- uses: azure/login@v1
  with:
    creds: ${{ secrets.AZURE_CREDENTIALS }}

Access and Secrets


For our GitHub Action to be able to create resources and deploy the code, it needs access. The azure/login@v1 action lets the workflow log in using a Service Principal. In other words, we will create an identity in the Azure Active Directory with enough permissions to do what we need.

Let's examine the following Azure CLI command:

`az ad sp create-for-rbac --name "c5m-Frankdemo" --role contributor --scopes /subscriptions/{subscription-id} --sdk-auth`

This will create a Service Principal named "c5m-Frankdemo" with the "contributor" role on the specified subscription. The contributor role can do almost anything except grant permissions.

Because no resources exist yet, the GitHub Action requires permissions at the subscription level. If you create the Resource Group outside of the CI-CD, you can limit the access to that specific resource group by using this command instead:

`az ad sp create-for-rbac --name "c5m-Frankdemo" --role contributor --scopes /subscriptions/{subscription-id}/resourceGroups/{resource-group} --sdk-auth`

The Azure CLI command returns a JSON document. We will copy-paste this JSON into a GitHub secret. GitHub secrets are encrypted and allow you to store sensitive information, such as access tokens, in your repository. To access them, go to the Settings of the repository and select Secrets from the left menu.
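For reference, the JSON returned by the --sdk-auth flag looks roughly like this (all the values below are placeholders, and a few extra endpoint fields are omitted):

{
    "clientId": "00000000-0000-0000-0000-000000000000",
    "clientSecret": "your-client-secret",
    "subscriptionId": "00000000-0000-0000-0000-000000000000",
    "tenantId": "00000000-0000-0000-0000-000000000000",
    "activeDirectoryEndpointUrl": "https://login.microsoftonline.com",
    "resourceManagerEndpointUrl": "https://management.azure.com/"
}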


Click the Add a new secret button, and type AZURE_CREDENTIALS as the name. It could be anything, as long as you use the same value in the YAML file describing the workflow. Put the JSON, including the curly brackets, into the Value textbox and click the Save button.

Provisioning the Azure Resources


Now that the workflow has access, we can execute some Azure CLI commands. Let's see what's missing:

- run: |
    az group create -n ${{ env.AZURE_GROUP_NAME }} -l eastus 
    az group deployment create -n ghaction -g ${{ env.AZURE_GROUP_NAME }} --template-file deployment/azuredeploy.json --parameters myWebAppName=${{ env.AZURE_WEBAPP_NAME }}

The first command creates an Azure Resource Group, where all the resources will be created. The second one deploys the website using an Azure Resource Manager (ARM) template. The --template-file deployment/azuredeploy.json flag tells us the template is a file named azuredeploy.json located in the deployment folder. Notice that the application name is passed to the parameter myWebAppName, using the environment variable.

An ARM template is simply a flat file that looks a lot like a JSON document. You can use any text editor; I like writing mine in Visual Studio Code with two extensions: Azure Resource Manager Snippets and Azure Resource Manager (ARM) Tools. With those tools, I can build ARM templates very efficiently. For this template, we need a service plan and a web app. Here is what the template looks like:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "myWebAppName": {
           "type": "string",
           "metadata": {
                "description": "WebAppName"
            }
        }
    },
    "variables": {},
    "resources": [
        {
            "name": "[parameters('myWebAppName')]",
            "type": "Microsoft.Web/sites",
            "apiVersion": "2016-08-01",
            "location": "[resourceGroup().location]",
            "tags": {
                "[concat('hidden-related:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/frankdemoplan')]": "Resource",
                "displayName": "[parameters('myWebAppName')]"
            },
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverfarms', 'frankdemoplan')]"
            ],
            "properties": {
                "name": "[parameters('myWebAppName')]",
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', 'frankdemoplan')]"
            }
        },
        {
            "name": "frankdemoplan",
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2018-02-01",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "F1",
                "capacity": 1
            },
            "tags": {
                "displayName": "frankdemoplan"
            },
            "properties": {
                "name": "frankdemoplan"
            }
        }
    ],
    "outputs": {},
    "functions": []
}

This template is simple; it only contains the two required resources: a service plan and a web app. To learn more about ARM templates, you can read my other post or check out this excellent introduction in the documentation.

Once the template is created, save it in the deployment folder of the repository.

The deployment


There are only two steps left in the YAML file: the deployment and the logout. Let's have a quick look at the deployment.

# deploy web app using Azure credentials
- name: 'Azure webapp deploy'
  uses: azure/webapps-deploy@v1
  with:
    app-name: ${{ env.AZURE_WEBAPP_NAME }}
    package: './myapp' 


Now that we are sure the resources exist in Azure, we can deploy the code. This is done with azure/webapps-deploy@v1, which takes the package generated by dotnet publish from the myapp folder. Since we are already authenticated, there is no need to specify credentials at this point.

Everything is ready for the deployment. You just need to commit and push (into master), and the GitHub Action will be triggered. You can follow the deployment in the Actions tab.



After a few minutes, the website should be available in Azure. This post only shows a very simple build and deployment, but you can do so many things with GitHub Actions, like running tests or packaging a container... I would love to know how you use them. Leave a comment or reach out on social media.


If you prefer, I also did a video of this post:




Reading Notes #403



Every week, I publish my reading notes. Those are the articles, blog posts, podcast episodes, and books that caught my interest. It's a mix of the news and of what I was looking for.

I passed the 4k subscribers mark on YouTube

The suggestion of the week


Cloud


Programming

  • ASP.NET Core updates in .NET Core 3.1 (Sourabh Shirhatti) - I'm very excited about this 3.1 version. I'm not sure why; maybe it's because it is the long-term support (LTS) release. Nevertheless, I will update all my projects.

Miscellaneous

  • 7 Dangers of Micromanagement (Jack Wallen) - Nice post to help you be the best version of yourself. (Note: it's now 6, but it was 7 when I read it.)

Books


Build Your Dream Stream

Ashley

Great book with many very interesting ideas. Not the same things we hear elsewhere, or if they are, they're expressed in very different words.









Let the adventure begin!


It's coming! I'm not talking about winter here, but about Microsoft Ignite, one of the biggest Microsoft events for the DevOps and Devs among us. This four-day event is a fantastic opportunity to get up to date with your favorite technology, learn the best practices, get certified, and meet tons of experts to talk about your projects!



This year is very special for me because I have the great pleasure of being part of the adventure: I will be presenting two sessions that are part of the Learning Paths.

A Learning Path is a series of connected learning modules that include sessions, hands-on experiences, technical workshops, certifications, and expert connections. Learning Paths work together to build upon what you've learned and to provide a comprehensive set of skills that help you reach your goals.

Figuring out Azure Functions (AFUN95)
Tailwind Traders is curious about the concept behind “serverless” computing – the idea that they can run small pieces of code in the cloud, without having to worry about the underlying infrastructure. In this session, we cover the world of Azure Functions, starting with an explanation of the servers behind serverless, exploring the languages and integrations available, and ending with a demo of when to use Logic Apps and Microsoft Flow.

Options for building and running your app in the cloud (APPS10)
See how Tailwind Traders avoided a single point of failure using cloud services to deploy their company website to multiple regions. We cover all the options they considered, explain how and why they made their decisions, then dive into the components of their implementation. In this session, see how they used Microsoft technologies like Visual Studio Code, Azure Portal, and Azure CLI to build a secure application that runs and scales on Linux and Windows VMs and Azure Web Apps with a companion phone app.

But wait, there is more...


For the second time, after Microsoft Ignite "the event", another event starts: the tour! This year, Microsoft Ignite The Tour will visit thirty (30) cities! Every continent, except Antarctica (why?! 😉), will be visited. Check the complete list of cities on the website and mark the dates on your calendar!

I will also be presenting many different sessions during this tour. While I'm not doing all of them, I'll be present on many occasions.

Looking forward to meeting you


So if you are planning to go to one of those events and would like to meet to talk about your project, show some bugs, ask questions, or just chat, reach out! It's ALWAYS a pleasure, and I'll bring some stickers and some special swag1 with me...

Drop me a message on Twitter, LinkedIn, YouTube, Instagram, and Facebook.


1: Follow me on Twitter to know more about this ;)

Reading Notes #388


Suggestion of the week

Cloud

Programming

Podcast

Miscellaneous

  • How To Develop Apps Like PUBG (Apoorv Gehlot) - An interesting article that gives us an idea of how a game like PUBG became that successful, and how they managed that rapid growth.

How to Deploy your Azure Function Automatically with ARM template (4 different ways)

It's so nice to be able to add serverless components to our solutions to make them better in a snap. But how do we manage them? In this post, I will explain how to create an Azure Resource Manager (ARM) template to deploy any Azure Function, and show how I used this structure to deploy an open-source project I've been working on these days.

Part 1 - The ARM template

An ARM template is a JSON file that describes our architecture. To deploy an Azure Function, we need at least three resources: a function app, a service plan, and a storage account.


The function app is, of course, our function. The service plan can be set as dynamic, or it can describe the type of resources that will be used by your function. The storage account is where our code lives.


In the previous image, you can see how those components interact with each other. Inside the function app, we will have a list of properties. One of those properties is the runtime; for example, in the AzUnzipEverything demo, it is dotnet. Another property is the connection string to our storage account, which is also part of our ARM template. Since that resource doesn't exist yet, we will need to build that value dynamically.

The function app node will contain a sub-resource of type sourcecontrols. This is where we specify where our code lives, so it can be cloned to Azure.

Building ARM for a Simple Function


Let's look at a template for a simple Azure Function that doesn't require any dependencies; we will examine it afterward.

You can use any text editor to edit your ARM template. However, VS Code bundled with the Azure Resource Manager Tools and Azure Resource Manager Snippets extensions is particularly efficient:
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "storageFunc",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "storageFunc"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2018-02-01",
            "name": "servicePlan",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "Y1",
                "tier": "Dynamic"
            },
            "properties": {
                "name": "servicePlan",
                "computeMode": "Dynamic"
            },
            "tags": {
                "displayName": "servicePlan"
            }
        },
        {
            "apiVersion": "2015-08-01",
            "type": "Microsoft.Web/sites",
            "name": "functionApp",
            "location": "[resourceGroup().location]",
            "kind": "functionapp",
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
                "[resourceId('Microsoft.Storage/storageAccounts', 'storageFunc')]"
            ],
            "properties": {
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
                "siteConfig": {
                    "appSettings": [
                        {
                            "name": "AzureWebJobsDashboard",
                            "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                        },
                        {
                            "name": "AzureWebJobsStorage",
                            "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                        },
                        {
                            "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
                            "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                        },
                        {
                            "name": "WEBSITE_CONTENTSHARE",
                            "value": "storageFunc"
                        },
                        {
                            "name": "FUNCTIONS_EXTENSION_VERSION",
                            "value": "~2"
                        },
                        {
                            "name": "FUNCTIONS_WORKER_RUNTIME",
                            "value": "dotnet"
                        }
                    ]
                }
            },
            "resources": [
                {
                    "apiVersion": "2015-08-01",
                    "name": "web",
                    "type": "sourcecontrols",
                    "dependsOn": [
                        "[resourceId('Microsoft.Web/sites/', 'functionApp')]"
                    ],
                    "properties": {
                        "RepoUrl": "https://github.com/FBoucher/AzUnzipEverything.git",
                        "branch": "master",
                        "publishRunbook": true,
                        "IsManualIntegration": true
                    }
                }
            ]
        }
    ],
    "outputs": {}
}

The Storage Account


The first resource listed in the template is the storage account. There is nothing specific about it.

The Service Plan


The service plan is the second resource in the list. It's important to note that to be able to use the Dynamic SKU, the apiVersion needs to be at least "2018-02-01". Then you specify the SKU:

    "sku": {
        "name": "Y1",
        "tier": "Dynamic"
    }

Of course, you can use another SKU if you prefer.
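For example, here is a hedged sketch of what a non-dynamic plan could look like, using a Standard S1 instead of the consumption plan:

    "sku": {
        "name": "S1",
        "tier": "Standard"
    }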

The Function App


The function app is the final resource added to the mix, and this is where all the pieces come together. It's important to note that the order in which the resources are listed is not considered by Azure during deployment (it's only for us ;) ). To let Azure know the right order, you need to add dependencies:

"dependsOn": [
    "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
    "[resourceId('Microsoft.Storage/storageAccounts', 'storageFunc')]"
]

This way, the Azure Function will be created after the service plan and the storage account are available. Then, in the properties, we can build the connection string to the blob storage using a reference:

{
    "name": "AzureWebJobsDashboard",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
}

The last piece of the puzzle is the sub-resource sourcecontrols inside the function app. This defines where Azure should clone the code from and which branch to use:

"resources": [
    {
        "apiVersion": "2015-08-01",
        "name": "web",
        "type": "sourcecontrols",
        "dependsOn": [
        "[resourceId('Microsoft.Web/sites/', 'functionApp')]"
        ],
        "properties": {
            "RepoUrl": "https://github.com/FBoucher/AzUnzipEverything.git",
            "branch": "master",
            "publishRunbook": true,
            "IsManualIntegration": true
        }
    }
]

To be sure that everything is fully automatic, the properties publishRunbook and IsManualIntegration must be set to true. Otherwise, you will need to manually synchronize your Git repository (in this case on GitHub) with the Git repository in Azure.

There is excellent documentation that explains many different scenarios in Automate resource deployment for your function app in Azure Functions.

Azure Unzip Everything


To deploy the project AzUnzipEverything available on GitHub, I needed one more Azure storage account with pre-defined containers (folders).


Of course, all the source code of both the Azure Function and the ARM template is available on GitHub, but let me highlight how the containers are defined in an ARM template.

"resources": [
    {
        "type": "blobServices/containers",
        "apiVersion": "2018-07-01",
        "name": "[concat('default/', 'input-files')]",
        "dependsOn": [
            "storageFiles"
        ],
        "properties": {
            "publicAccess": "Blob"
        }
    }
]

Just like with sourcecontrols, we need to add a list of sub-resources to our storage account. The name MUST start with 'default/'.

Part 2 - Four Deployment Options

Now that we have a template that describes our needs, we just need to deploy it. There are multiple ways to do it; let's look at four of them.

Deploy from the Azure Portal


Navigate to the Azure Portal (https://portal.azure.com) from your favorite browser and search for "deploy a custom template" directly in the search bar located at the top of the screen (in the middle), or go to https://portal.azure.com/#create/Microsoft.Template. Once on the Custom deployment page, click on the link Build your own template in the editor. From there, you can copy-paste or upload your ARM template. You need to save it to see the real deployment form.


Deploy with a script


Whether you prefer PowerShell or the Azure CLI, you can easily deploy your template with these two commands.

In Azure CLI

# create resource group
az group create -n AzUnzipEverything -l eastus

# deploy it
az group deployment create -n cloud5mins -g AzUnzipEverything --template-file "deployment\deployAzure.json" --parameters "deployment\deployAzure.parameters.json"  

In PowerShell

# create resource group
New-AzResourceGroup -Name AzUnzipEverything -Location eastus

# deploy it
New-AzResourceGroupDeployment -ResourceGroupName  AzUnzipEverything -TemplateFile deployment\deployAzure.json

Deploy to Azure Button


One of the best ways to help people deploy your solution in their own Azure subscription is the Deploy to Azure button.



You need to create an image link (in HTML or Markdown) pointing to a special destination URL built in two parts.

The first one is a link to the Azure Portal:

https://portal.azure.com/#create/Microsoft.Template/uri/

And the second one is the location of your ARM template:

https%3A%2F%2Fraw.githubusercontent.com%2FFBoucher%2FAzUnzipEverything%2Fmaster%2Fdeployment%2FdeployAzure.json

However, this URL needs to be encoded. There are plenty of encoders online, but you can also do it from the terminal with the following PowerShell command (a big thanks to @BrettMiller_IT, who showed me this trick during one of my live streams):

[System.Web.HttpUtility]::UrlEncode("https://raw.githubusercontent.com/FBoucher/Not-a-Dog-Workshop/master/deployment/deployAzure.json")

Clicking the button brings the user to the same page on the Azure Portal, but in their own subscription.
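Putting the two parts together, here is a hedged Markdown sketch of the button; the badge image URL (aka.ms/deploytoazurebutton) is the commonly used shortcut, so verify it still resolves before shipping:

[![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FFBoucher%2FAzUnzipEverything%2Fmaster%2Fdeployment%2FdeployAzure.json)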

Azure DevOps Pipeline

From the Azure DevOps portal (https://dev.azure.com), select your project and create a new Release Pipeline. Click on the + Add an artifact button to connect your Git repository.



Once it's added, you need to add a task to the current job. Click on the link 1 job, 0 task. Now you just need to specify your Azure subscription, the name of the resource group, and the location of your ARM template inside your repository. To make the deployment automatic at each push to the repository, click the little lightning bolt and enable the Continuous deployment trigger.


Wrapping-up

Voila, you now have four different ways to deploy your Azure Function automatically. But don't take my word for it: try them yourself! If you need more details, you can visit the project on GitHub or watch this video where I demo the content of this post.


Be more Productive by using Inline Code in your Azure Logic App

In a project using Azure Logic Apps that I am working on, I needed to manipulate strings. I could have created APIs or Azure Functions, but the code is very simple and doesn't use any external libraries. In this post, I will show you how to use the new Inline Code action to execute your code snippet directly inside your Logic App.

Quick Context


The Logic App will read a file from my OneDrive (it would also work with Dropbox, Box, etc.). Here is an example of the file:

Nice tutorial that explains how to build, using postman, an efficient API.[cloud.azure.postman.tools]

The goal is to extract tags, contained between the square brackets, from the text.

Logic App: Get File Content


From the Azure Portal, create a new Logic App by clicking the big green "+" button in the top left corner and searching for Logic App.

For this demo, I will use the Interval as a trigger because I will execute the Logic App manually.

The first step will be a Get File Content action from the OneDrive connector. Once you authorized Azure to access your OneDrive folder, select the file you want to read. For me, it's /dev/simpleNote.txt

Integration Account


To access the workflowContext, the Azure Logic App requires an Integration Account, so the next step is to create one. Save the current Logic App, and click on the big "+" button in the top right corner. This time, search for integration. Select Integration Account, and complete the form to create it.


We now need to assign it to our Logic App. From the Logic App blade, in the options list select Workflow Settings. Then select your integration account, and don't forget to save!

Logic App: Inline Code


To add the action at the end of your workflow, click the New step button. Search for Inline Code, and select the action Execute JavaScript Code.


Before copy-pasting the code into the new Inline Code action, let's have a quick look at it:

var note = "" + workflowContext.actions.Get_file_content.outputs.body;
var posTag = note.lastIndexOf("[") + 1;
var cleanNote = {};

if (posTag > 0) {
    cleanNote.tags = note.substring(posTag, note.length - 1);
    cleanNote.msg = note.substring(0, posTag - 1);
}
return cleanNote;

On the first line, we assign to a variable note the content of the Get_file_content output. We access it using the workflowContext. This context has access to the trigger and the actions. To find the name of an action, replace the spaces in its display name with the underscore character "_".
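With the sample note shown earlier, the action should return something like this (a sketch of the expected output):

{
    "tags": "cloud.azure.postman.tools",
    "msg": "Nice tutorial that explains how to build, using postman, an efficient API."
}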


You can also switch to Code View and see the names of all the components in the JSON code.

Logic App: Use Inline Code Result


Of course, you can use the output of your Inline Code action in other steps. You just need to use the Result from the dynamic content menu.


If for some reason the dynamic content list doesn't contain your Inline Code action, you can always add the expression directly: @body('Cleaning_Note')?['body'].


Your Logic App should now look like this:


Verdict


The Inline Code action is very promising. Right now, it's limited to JavaScript and cannot access variables or loops. However, for simple code that doesn't require any references, it's easier to maintain and deploy. You can learn more about exactly what is and isn't covered here.
And it works, as this result shows.


If you prefer watching instead of reading


I also have a video of this post if you prefer.



References


Reading Notes #381

Suggestion of the week


Cloud


Programming


Miscellaneous


Books

Author: Eric Barker

Nice book. There is always a good story to correlate with the current point, and then it can go in a different direction with another story. All the stories are complementary and add, layer by layer, to the more complex message being delivered to us. Easy to read, enjoyable from the beginning to the last word.

How to make your deployments successful every time

You are done with your code, and you are ready to deploy it to Azure. You execute the PowerShell or Bash script you have and BOOM! An error message says that the name is already taken. In this post, I will show you a simple way to look like a boss and make your deployments work every time.

____ with given name ____ already exists.

The tricks other use


You could try to add a digit at the end of the resource name (ex: demo-app1, demo-app2, demo-app123...), but that’s not really professional. You could create a random string and append it to the name. Yes, that will work... once. If you try to redeploy your resources, that value will change, so it will never be the same.
The solution would be to have a unique string that is constant for our environment.

The solution


The solution is to use the uniqueString() function that is part of the Azure Resource Manager (ARM) template language. If we look in the documentation, uniqueString creates a deterministic hash string based on the values provided as parameters. Let’s see a quick example with an ARM template that deploys a website named demo-app.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "variables": {
        "webAppName": "demo-app"
    },
    "resources": [
        {
            "type": "Microsoft.Web/sites",
            "apiVersion": "2015-08-01",
            "name": "[variables('webAppName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "[concat('hidden-related:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/frankdemo-plan')]": "Resource",
                "displayName": "[variables('webAppName')]"
            },
            "dependsOn": [
                "Microsoft.Web/serverfarms/frankdemo-plan"
            ],
            "properties": {
                "name": "[variables('webAppName')]",
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms/', 'frankdemo-plan')]"
            }
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2016-09-01",
            "name": "frankdemo-plan",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "F1",
                "capacity": 1
            },
            "tags": {
                "displayName": "frankdemo-plan"
            },
            "properties": {
                "name": "frankdemo-plan"
            }
        }
    ],
    "outputs": {}
}

If you try to deploy this template, you will get an error because the name demo-app is already taken... no surprise here.

Let’s create a new variable suffix, using the Resource Group Id and Location as values. Then we just need to append this value to our name using the concat() function.

    "variables": {
        "suffix": "[uniqueString(resourceGroup().id, resourceGroup().location)]",
        "webAppName": "[concat('demo-app', variables('suffix'))]"
    }

It’s that simple! Now every time you deploy, a unique string is appended to your resource name. That string will always be the same for a given Resource Group and Location.

Because some resource types are more restrictive than others, you may need to adapt your new name. Maybe the name of your resource plus that thirteen-character hash will be too long... No problem, you can easily make it shorter and all lowercase by using substring() and toLower():

 "parameters": {},
    "variables": {
        "suffix": "[substring(toLower(uniqueString(resourceGroup().id, resourceGroup().location)),0,5)]",
        "webAppName": "[concat('demo-app', variables('suffix'))]"
    }

Voila! Now, by using an ARM template, you can deploy and redeploy without any problem, reproducing the same solution you built. To learn more about ARM templates, you can jump into the documentation, where you will find samples, step-by-step tutorials, and more.


If you have a specific question about ARM templates or if you would like to see more tips like this one, don't hesitate to ask in the comments section or reach out on social media!

In a video, please!


I also have a video of this post if you prefer.




Image by StartupStockPhotos from Pixabay

Reading Notes #376

Cloud

Miscellaneous

Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline

Static websites are lightning fast, and running them from Azure Blob Storage instead of a Web App is incredibly economical (less than $1/month). Does that mean you need to do everything manually? Absolutely not! In a previous post, I explained how to automatically generate your static website using a Build Pipeline inside Azure DevOps. In this post, let's complete the CI-CD by creating a Release Pipeline to deploy it.

The Azure Resource Manager (ARM) Template


First things first. If we want our release pipeline to deploy our website to Azure, we first need to be sure our resources are available "up there." The best way to do this is by using an Azure Resource Manager (ARM) template. I will use the same project started in the previous post; feel free to adapt it to your structure or copy from it.

Create a new file named deploy.json in the deployment folder. We only need a simple storage account:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "StorageName": {
            "type":"string",
            "defaultValue": "cloudenfrancaisv2",
            "maxLength": 24
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "[parameters('StorageName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "[parameters('StorageName')]"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        }
    ],
    "outputs": {}
}

I used a parameter (StorageName) to define the name of the storage account. This way, I can have multiple pipelines deploying to different storage accounts.
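For example, a different pipeline (or a manual run) could reuse the same template with another name; here is a hedged Azure CLI sketch, where the resource group and storage name are placeholders:

# deploy the same template with a different storage account name
az group deployment create -g DemoStaticWebsite --template-file deployment/deploy.json --parameters StorageName=anotherstaticsite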

Now, to make the ARM template accessible to the release pipeline, we also need to publish it. The easiest way to do that is to add another CopyFiles task to our azure-pipeline.yml. Add this task just before the PublishBuildArtifacts task:

- task: CopyFiles@2
  displayName: 'Copy deployment content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/deployment'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/deployment
    cleanTargetFolder: true

Once you commit and push these changes, it will trigger a build. When done, the ARM template will be available, and we will be able to start working on the release pipeline.

The Release Pipeline


Navigate to the DevOps project created in the previous post. This time, create a new Release Pipeline. When asked, select an empty template; we will pick the tasks we need manually.

First, we need to define the trigger and where our artifacts are. Click on the artifact box on the left of the screen, select the build pipeline, and let's use the latest version of the artifact for our deployment.

To get continuous deployment, you need to enable it by clicking on the lightning bolt and selecting the Enabled toggle.

Now let's select our tasks. Click on the "+" sign to add new tasks. We need three of these: Azure Resource Group Deployment, Azure CLI, and Azure File Copy.



Task 1 - Azure Resource Group Deployment


The first one is an Azure Resource Group Deployment. It will be used to deploy our ARM template and make sure the resources are available in Azure.

To configure the ARM deployment, we need to select the Azure subscription and authorize the pipeline to access it. Then you need to specify the name of the resource group you will be deploying into, its location, and finally point to the linked ARM template.


Task 2 - Azure CLI


The second task is an Azure CLI task. As I am writing this post, it's not possible to enable the static website property of a storage account from an ARM template. Therefore, we will execute an Azure CLI command to change that configuration. Once you've picked the Azure subscription, select Inline script and enter this Azure CLI command:

az storage blob service-properties update --account-name wyamfrankdemo --static-website  --index-document index.html

This will enable the static website property of the storage account named wyamfrankdemo, and set the default document to index.html.

Task 3 - Azure File Copy


The last task is an Azure File Copy to copy all our files from $(System.DefaultWorkingDirectory)/drop/drop/outpout to the $web container (in our Azure Blob storage). The container must be named $web; that's the name Azure uses for static websites.
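This demo uses the classic (visual) Release Pipeline, but as a hedged sketch, an equivalent YAML task could look like this; the service connection name is a placeholder:

- task: AzureFileCopy@3
  displayName: 'Copy site to the $web container'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/drop/drop/outpout'
    azureSubscription: 'my-azure-connection'   # placeholder service connection
    Destination: AzureBlob
    storage: wyamfrankdemo
    ContainerName: '$web'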

Wrapping up


Once you are done configuring the Release Pipeline, it's time to save and run it. After only a minute or two (this demo is pretty small), the blog should be available in Azure. To find your endpoint (aka URL), you can go to portal.azure.com and look at the static website property of the storage account we just created.
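If you prefer the command line, here is a hedged Azure CLI sketch to retrieve that endpoint:

# show the static website URL of the storage account
az storage account show --name wyamfrankdemo --query "primaryEndpoints.web" --output tsv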

In a video, please!


I also have a video of this post if you prefer.





How to use Azure DevOps Pipeline and Cake to generate a static website

I have that little website, a blog that doesn't consume much bandwidth, and I was looking to optimize it. Since Azure Blob Storage is such an inexpensive resource, I thought it would be the perfect fit. I could use a static website generator to transform my markdown files into a nice-looking blog and publish it to Azure! Using an Azure DevOps pipeline, at every git push everything could happen automatically, without my having anything installed on my machine... meaning I could write a new blog post from anywhere and still be able to update my blog.

In this post, I will explain all the steps required to create a continuous integration and continuous deployment process to deploy a static website into Azure.

The Goal


The idea here is to have a folder on a local machine tracked by git. At every push, we want that change to trigger our CI-CD process. The Build Pipeline will generate the static website. The Release Pipeline will create our Azure resources and publish those artifacts.

The Static Website


In this post, I'm using Wyam.io as the static website generator. However, it doesn't matter; there are tons of excellent generators available: Jekyll, Hugo, Hexo, etc. I selected Wyam because it is written in .NET, and if I eventually want to dig deeper, it will be easier for me.

All those generators follow the same pattern: you have an input folder with all your posts and images, and an output folder that contains the generated result. You don't need to track the content of your output folder, so it's good practice to modify the .gitignore file accordingly. As an example, here is what mine looks like:

output/

tool/
tools/

wwwroot/

config.wyam.dll
config.wyam.hash
config.wyam.packages.xml

Build Pipeline


The build pipeline will generate our website for us, so it needs to have the generator installed. A great tool for this kind of task is Cake. What I like about it is that it's cross-platform, so I can use it without worrying about which OS it will run on.

The Azure pipeline is defined in an azure-pipeline.yml file. Installing Cake should definitely be one of our first steps. To learn how to do that, navigate to the Get started page of the Cake website: it explains that we need to execute a build.ps1 or build.sh (depending on your build setup). That will install Cake and execute the build.cake file. Those files can be found in the GitHub repository mentioned on the website.

On the Wyam website, in the deployment section of the documentation, you will find a sample for our required build.cake file. It looks like this:

#tool nuget:?package=Wyam&version=2.2.0
#addin nuget:?package=Cake.Wyam&version=2.1.3

var target = Argument("target", "Build");

Task("Build")
    .Does(() =>
    {
        Wyam( new WyamSettings {
            Recipe = "Blog",
            Theme = "CleanBlog"
        });        
    });

Task("Preview")
    .Does(() =>
    {
        Wyam(new WyamSettings
        {
            Recipe = "Blog",
            Theme = "CleanBlog",
            Preview = true,
            Watch = true
        });        
    });

RunTarget(target);

On the first lines, it installs the required NuGet packages (you should definitely specify the versions). Then it defines some tasks that run the generation commands. Create that file at the root of the website folder.

Now let's have a look at the azure-pipeline.yml file.

trigger:
- master

variables:
  DOTNET_SDK_VERSION: '2.1.401'

pool:
  vmImage: 'VS2017-Win2016'

steps:
- task: DotNetCoreInstaller@0
  displayName: 'Use .NET Core SDK $(DOTNET_SDK_VERSION)'
  inputs:
    version: '$(DOTNET_SDK_VERSION)'

- powershell: ./build.ps1
  displayName: 'Execute Cake PowerShell Bootstrapper'

- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/outpout
    cleanTargetFolder: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'

The first line specifies the pipeline trigger. In our case, we will watch the master branch. Then I declare a variable to hold the .NET Core version. That way, it will be easier to maintain the script in the future.

The pool section specifies what kind of agent is created. Here I'm using a Windows one, but I could have used Linux too (all the components are cross-platform).

Then comes the list of steps. The first one installs .NET Core. The second step is a PowerShell command that executes our build.ps1 file. At this stage, the static website should be generated in the output subfolder. The last two steps copy the content of the output folder into the ArtifactStagingDirectory and then publish it. This way, the Release Pipeline can access the artifacts.

There is detailed information about all the commands for a YAML Azure Pipeline file in the documentation. Create your own or copy-paste this one into a new azure-pipeline.yml file under a subfolder named deployment. Once your files are created, commit and push them to GitHub or any repository.

Navigate to Azure DevOps (dev.azure.com). Open your project, or create a new one. Now, from the left menu, click on Pipelines (the rocket icon) to create a new one. If you are using an external repository, like me, you will need to authorize Azure DevOps to access your repo.

To configure the pipeline, since we already have created the azure-pipeline.yml file, select the Existing Azure Pipeline YAML file option and point it to our file in the deployment folder.



It will open our YAML file. If you wish, you can update it. Run it by clicking the blue Run button in the top-right corner. Your build pipeline is done. Now every time you push changes to your repository, the build will be triggered and generate the static website.

(Next post in the series - Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline)


In a video, please!

I also have a video of this post if you prefer.






References

Reading Notes #370

Cloud


Programming


Databases


Miscellaneous



How to Unzip Automatically your Files with Azure Function v2

I published a video that explains how to unzip files without any code by using Logic Apps. However, that solution didn't work for bigger files or different archive types. This post explains how you can use Azure Functions to cover those situations. This first iteration supports zip files; all the code is available on my GitHub.

Prerequisites


To create the Azure Function, I will use the excellent Azure Functions extension for Visual Studio Code. You don't "need" it, but it makes things very easy.


You can easily install the extension from inside Visual Studio Code by clicking on the Extensions button in the left menu. You will also need to install the Azure Functions Core Tools.

Creating the Function


Once the extension is installed, you will find a new button in the left menu. It opens a new section with four options: Create New Project, Create Function, Deploy to Function App, and Refresh.


Click on the first option, Create New Project. Select a local folder and a language; for this demo, I will use C#. This will create a few files and folders. Now let's create our function. From the extension menu, select the second option, Create Function. Create a Blob Trigger named UnzipThis in the folder we just created, and select (or create) a Resource Group, Storage Account, and location in your subscription. After a few seconds, another question will pop up asking for the name of the container that our blob trigger monitors. For this demo, input-files is used.

Once the function is created you will see this warning message.


What that means is that to be able to debug locally, we need to set AzureWebJobsStorage to UseDevelopmentStorage=true in the local.settings.json file. It will look like this:

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "unziptools_STORAGE": "DefaultEndpointsProtocol=https;AccountName=unziptools;AccountKey=XXXXXXXXX;EndpointSuffix=core.windows.net",
    }
}

Open the file UnzipThis.cs; this is our function. On the first line of the function, you can see that the blob trigger is defined:

[BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]Stream myBlob

The binding is attached to the container named input-files, in the storage account reachable through the "cloud5mins_storage" connection. The real connection string is in the local.settings.json file.

Now, let's put the code we need for our demo:

[FunctionName("Unzipthis")]
public static async Task Run([BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]CloudBlockBlob myBlob, string name, ILogger log)
{
    log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name}");

    string destinationStorage = Environment.GetEnvironmentVariable("destinationStorage");
    string destinationContainer = Environment.GetEnvironmentVariable("destinationContainer");

    try{
        if(name.Split('.').Last().ToLower() == "zip"){

            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(destinationStorage);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference(destinationContainer);
            
            using(MemoryStream blobMemStream = new MemoryStream()){

                await myBlob.DownloadToStreamAsync(blobMemStream);

                using(ZipArchive archive = new ZipArchive(blobMemStream))
                {
                    foreach (ZipArchiveEntry entry in archive.Entries)
                    {
                        log.LogInformation($"Now processing {entry.FullName}");

                        //Replace all NO digits, letters, or "-" by a "-" Azure storage is specific on valid characters
                        string valideName = Regex.Replace(entry.Name,@"[^a-zA-Z0-9\-]","-").ToLower();

                        CloudBlockBlob blockBlob = container.GetBlockBlobReference(valideName);
                        using (var fileStream = entry.Open())
                        {
                            await blockBlob.UploadFromStreamAsync(fileStream);
                        }
                    }
                }
            }
        }
    }
    catch(Exception ex){
        log.LogInformation($"Error! Something went wrong: {ex.Message}");

    }            
}

UPDATED: Thanks to Stefano Tedeschi who found a bug and suggested a fix.

The source of our compressed file is defined in the trigger. To define the destination, destinationStorage and destinationContainer are used. Their values are saved in local.settings.json. Then, because this function only supports .zip files, a little validation is required.

Next, we create an archive instance using the new System.IO.Compression library. We then create references to the storage account, the blob client, and the container. It's not possible to use a second binding here, because a single archive file yields a variable number of extracted files. The bindings are static; therefore, we need to use the regular storage API.

Then, for every file (aka entry) in the archive, the code uploads it to the destination storage.

Deploying


To deploy the function, from the Azure Function extension click on the third option: Deploy to Function App. Select your subscription and Function App name.

Now we need to configure our settings in Azure. By default, the local.settings.json values are NOT deployed. Once again, the extension is handy.


Under the subscription, expand the freshly deployed Function App AzUnzipEverything, and right-click on Application Settings. Use Add New Setting to create cloud5mins_storage, destinationStorage, and destinationContainer.

The function is now deployed and the settings are set. Now we only need to create the blob storage containers, and we will be able to test the function. You can easily do that directly from the Azure portal (portal.azure.com).
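If you prefer the command line over the portal, here is a hedged Azure CLI sketch; the account name comes from this demo's local.settings.json, and the destination container name is a placeholder, so adjust both to your values (it also assumes you are logged in with az login):

# container monitored by the blob trigger
az storage container create --name input-files --account-name unziptools

# destination container for the extracted files (placeholder name)
az storage container create --name output-files --account-name unziptools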

You are now ready to upload a file into the input-files container.

Let's Code Together


This first iteration only supports zip files. All the code is available on GitHub. Feel free to use it. If you would like to see or add support for other archive types, join me on GitHub!

In a video, please!


I also have a video of this post if you prefer.



I also have an extended version where I introduce the Visual Studio Code extension for Azure Functions in more depth and explain more details about Azure Functions v2.