
Reading Notes #543


Already time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, podcast episodes, and books that catch my interest during the week. 


If you think you may have interesting content, share it!


The Suggestion of the week

  • What is .NET, and why should you choose it? (.NET Team) - This is not an ordinary blog post. It reminds me of those deep, interesting MSDN articles. A great read for new or experienced developers who would like to know more about .NET or refresh their memory.

Cloud

  • Azure DevOps Pipelines: If Expressions and Conditions (John Folberth) - This nice post in the series on Azure Pipelines focuses on conditions. These are very efficient tools to customize our pipelines just the way we want them.

  • What is an Azure Load Balancer? (Cary Roys) - Do you know what an Azure Load Balancer is, or do you only think you know? In this nice post, not only will you learn what it really is, but the post also shares some OSS tools to test your ALB and explains how to use them.

Programming

Podcasts

  • Sustainable Open Source with Sarah Novotny (.NET Rocks!) - Great episode with the Microsoft Open Source Lead talking about why truly open source software is important and how Microsoft really believes in its future.
  • Digging Up the Past with Sarah Parcak (A Bit of Optimism) - Using satellite images to do archeology... wait, what?! Great episode, really interesting from the first second until the last one.

Miscellaneous


~Frank

Reading Notes #509


Good Monday! It's already time to share new reading notes. Here is a list of all the articles, blog posts, and podcast episodes that caught my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

Podcasts

Miscellaneous


~frank

Reading Notes #508


It's... Tuesday! 
Yes, I know, one day late, but it's still time to share my reading notes. Those are a curated list of all the articles, blog posts, podcast episodes, and books that caught my interest during the week. It's a mix of current events and what I consumed.

If you think you may have interesting content, share it!

Cloud

Programming

Podcasts

Miscellaneous


~frank


Reading Notes #463




Every Monday, I share my "reading notes". This is a curated list of all the articles, blog posts, and books that caught my interest during the week. It's a mix of current events and what I consumed.

Cloud

Programming

Databases

Podcasts

Miscellaneous

  • Fix for Elgato Key Light not found by Control Center (Scott Hanselman) - Sorry, but I'm happy not to be alone with those thoughts, and in that situation. My Key Light is now optional in my setup because when I need a light NOW, I don't have time to figure out issues.

~Frank


Reading Notes #448


Cloud

Programming

Miscellaneous

Podcasts

Reading Notes #417


Every Monday, I share my "reading notes". Those are the articles, blog posts, podcast episodes, and books that caught my interest during the week.
It's a mix of current events and what I consumed.
Enjoy!


Cloud

Programming

Podcasts

Miscellaneous


~


Reading Notes #390


Suggestion of the week

Cloud

Programming

Podcasts

  • Economics of Kubernetes, with Owen Rog (Craig and Adam) - Really interesting episode. Of course, all the news about Kubernetes was interesting, but even more so the economics of cloud computing with the guest of the week, Owen Rog.

Miscellaneous

How to Deploy your Azure Function Automatically with ARM template (4 different ways)

It's so nice to be able to add serverless components to our solutions to make them better in a snap. But how do we manage them? In this post, I will explain how to create an Azure Resource Manager (ARM) template to deploy any Azure Function and show how I used this structure to deploy an open-source project I've been working on these days.

Part 1 - The ARM template

An ARM template is a JSON file that describes our architecture. To deploy an Azure Function we need at least three resources: a function app, a service plan, and a storage account.


The function app is, of course, our function. The service plan can be set as dynamic or describe the type of resources your function will use. The storage account is where our code lives.


In the previous image, you can see how those components interact with each other. Inside the function, we will have a list of properties. One of those properties is the runtime; for example, in the AzUnzipEverything demo, it will be dotnet. Another property is the connection string to our storage account, which is also part of our ARM template. Since that resource doesn't exist yet, we will need to build it dynamically.

The function node will contain a sub-resource of type sourcecontrols. This is where we will specify where our code is, so it can be cloned to Azure.

Building ARM for a Simple Function


Let's look at a template for a simple Azure Function that doesn't require any dependency, and we will examine it afterward.

You can use any text editor to edit your ARM template. However, VS Code bundled with the extensions Azure Resource Manager Tools and Azure Resource Manager Snippets is particularly efficient.
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "storageFunc",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "storageFunc"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2018-02-01",
            "name": "servicePlan",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "Y1",
                "tier": "Dynamic"
            },
            "properties": {
                "name": "servicePlan",
                "computeMode": "Dynamic"
            },
            "tags": {
                "displayName": "servicePlan"
            }
        },
        {
            "apiVersion": "2015-08-01",
            "type": "Microsoft.Web/sites",
            "name": "functionApp",
            "location": "[resourceGroup().location]",
            "kind": "functionapp",
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
                "[resourceId('Microsoft.Storage/storageAccounts', 'storageFunc')]"
            ],
            "properties": {
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
                "siteConfig": {
                    "appSettings": [
                        {
                            "name": "AzureWebJobsDashboard",
                            "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                        },
                        {
                            "name": "AzureWebJobsStorage",
                            "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                        },
                        {
                            "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
                            "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                        },
                        {
                            "name": "WEBSITE_CONTENTSHARE",
                            "value": "storageFunc"
                        },
                        {
                            "name": "FUNCTIONS_EXTENSION_VERSION",
                            "value": "~2"
                        },
                        {
                            "name": "FUNCTIONS_WORKER_RUNTIME",
                            "value": "dotnet"
                        }
                    ]
                }
            },
            "resources": [
                {
                    "apiVersion": "2015-08-01",
                    "name": "web",
                    "type": "sourcecontrols",
                    "dependsOn": [
                        "[resourceId('Microsoft.Web/sites/', 'functionApp')]"
                    ],
                    "properties": {
                        "RepoUrl": "https://github.com/FBoucher/AzUnzipEverything.git",
                        "branch": "master",
                        "publishRunbook": true,
                        "IsManualIntegration": true
                    }
                }
            ]
        }
    ],
    "outputs": {}
}

The Storage Account


The first resource listed in the template is the storage account. There is nothing specific about it.

The Service Plan


The service plan is the second resource in the list. It's important to note that to be able to use the Dynamic SKU, the apiVersion needs to be at least "2018-02-01". Then you specify the SKU.

    "sku": {
        "name": "Y1",
        "tier": "Dynamic"
    }

Of course, you can use another SKU if you prefer.
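
For example, here is a sketch of what a dedicated (non-consumption) plan could look like; S1/Standard is just one common choice, not something used in this project:

    "sku": {
        "name": "S1",
        "tier": "Standard"
    }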

The Function App


The final resource added to the mix, and this is where all the pieces come together. It's important to note that the order in which the resources are listed is not considered by Azure while deploying (it's only for us ;) ). To let Azure know, you need to add dependencies.

"dependsOn": [
    "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
    "[resourceId('Microsoft.Storage/storageAccounts', 'storageFunc')]"
]

This way, the Azure Function will be created after the service plan and the storage account are available. Then, in the properties, we will be able to build the connection string to the blob storage using a reference.

{
    "name": "AzureWebJobsDashboard",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
}

The last piece of the puzzle is the sub-resource sourcecontrols inside the function app. It defines where Azure should clone the code from and which branch to use.

"resources": [
    {
        "apiVersion": "2015-08-01",
        "name": "web",
        "type": "sourcecontrols",
        "dependsOn": [
        "[resourceId('Microsoft.Web/sites/', 'functionApp')]"
        ],
        "properties": {
            "RepoUrl": "https://github.com/FBoucher/AzUnzipEverything.git",
            "branch": "master",
            "publishRunbook": true,
            "IsManualIntegration": true
        }
    }
]

To be sure that everything is fully automatic, the properties publishRunbook and IsManualIntegration must be set to true. Otherwise, you will need to synchronize manually between your Git repository (in this case on GitHub) and the Git in Azure.
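
If you ever need to trigger that synchronization by hand, an Azure CLI command along these lines should do it (a sketch; the names are taken from this post's example):

# manually sync the repository configured as the deployment source
az webapp deployment source sync --name functionApp --resource-group AzUnzipEverything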

There is excellent documentation that explains many different scenarios in Automate resource deployment for your function app in Azure Functions.

Azure Unzip Everything


To deploy the project AzUnzipEverything available on GitHub, I needed one more Azure Storage account with pre-defined containers (folders).


Of course, all the source code for both the Azure Function and the ARM template is available on GitHub, but let me highlight how the containers are defined in an ARM template.

"resources": [
    {
        "type": "blobServices/containers",
        "apiVersion": "2018-07-01",
        "name": "[concat('default/', 'input-files')]",
        "dependsOn": [
            "storageFiles"
        ],
        "properties": {
            "publicAccess": "Blob"
        }
    }
]

Just like with sourcecontrols, we will need to add a list of sub-resources to our storage account. The name MUST start with 'default/'.
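
For instance, a second container could be declared in the same list following the identical naming rule (the container name here is only an illustration, not necessarily one from the actual project):

    {
        "type": "blobServices/containers",
        "apiVersion": "2018-07-01",
        "name": "[concat('default/', 'destination-files')]",
        "dependsOn": [
            "storageFiles"
        ],
        "properties": {
            "publicAccess": "None"
        }
    }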

Part 2 - Four Deployment Options

Now that we have a template that describes our needs, we just need to deploy it. There are multiple ways it can be done, but let's look at four of them.

Deploy from the Azure Portal


Navigate to the Azure Portal (https://portal.azure.com) from your favorite browser and search for "deploy a custom template" directly in the search bar located at the top of the screen (in the middle), or go directly to https://portal.azure.com/#create/Microsoft.Template. Once on the Custom deployment page, click on the link Build your own template in the editor. From there, you can copy-paste or upload your ARM template. You need to save it to see the real deployment form.


Deploy with a script


Whether in PowerShell or in Azure CLI, you can easily deploy your template with these commands.

In Azure CLI

# create resource group
az group create -n AzUnzipEverything -l eastus

# deploy it
az group deployment create -n cloud5mins -g AzUnzipEverything --template-file "deployment\deployAzure.json" --parameters "deployment\deployAzure.parameters.json"  

In PowerShell

# create resource group
New-AzResourceGroup -Name AzUnzipEverything -Location eastus

# deploy it
New-AzResourceGroupDeployment -ResourceGroupName  AzUnzipEverything -TemplateFile deployment\deployAzure.json
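
If you want to pass the same parameter file as in the CLI example, New-AzResourceGroupDeployment accepts it too:

# deploy it with a parameter file
New-AzResourceGroupDeployment -ResourceGroupName AzUnzipEverything -TemplateFile deployment\deployAzure.json -TemplateParameterFile deployment\deployAzure.parameters.json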

Deploy to Azure Button


One of the best ways to help people deploy your solution in their Azure subscription is the Deploy to Azure button.



You need to create an image link (in HTML or Markdown) pointing to a special destination built in two parts.

The first one is a link to the Azure Portal:

https://portal.azure.com/#create/Microsoft.Template/uri/

And the second one is the location of your ARM template:

https%3A%2F%2Fraw.githubusercontent.com%2FFBoucher%2FAzUnzipEverything%2Fmaster%2Fdeployment%2FdeployAzure.json

However, this URL needs to be encoded. There are plenty of encoders online, but you can also do it from the terminal with the following command (a big thanks to @BrettMiller_IT who showed me this trick during one of my live streams).

[System.Web.HttpUtility]::UrlEncode("https://raw.githubusercontent.com/FBoucher/Not-a-Dog-Workshop/master/deployment/deployAzure.json")

Clicking the button will bring the user to the same page on the Azure Portal, but in their own subscription.
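
Putting the two parts together, the Markdown for the button could look like this (the button image URL, aka.ms/deploytoazurebutton, is the one Microsoft documents; swap in your own encoded template URL):

[![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FFBoucher%2FAzUnzipEverything%2Fmaster%2Fdeployment%2FdeployAzure.json)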

Azure DevOps Pipeline

From the Azure DevOps portal (https://dev.azure.com), select your project and create a new Release Pipeline. Click on the + Add an artifact button to connect your Git repository.



Once it's added, you need to add a task to the current job. Click on the link 1 job, 0 task. Now you just need to specify your Azure subscription, the name of the resource group, and select the location of your ARM template inside your repository. To make the deployment automatic with each push to the repository, click that little lightning bolt and enable the Continuous deployment trigger.
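
If you prefer a YAML pipeline over the classic editor, a roughly equivalent step could look like this (a sketch; the service connection name and artifact paths below are placeholders, not values from this project):

- task: AzureResourceGroupDeployment@2
  displayName: 'Deploy the ARM template'
  inputs:
    # 'MyAzureServiceConnection' is a placeholder for your own service connection
    azureSubscription: 'MyAzureServiceConnection'
    action: 'Create Or Update Resource Group'
    resourceGroupName: 'AzUnzipEverything'
    location: 'East US'
    templateLocation: 'Linked artifact'
    csmFile: '$(System.DefaultWorkingDirectory)/drop/deployment/deployAzure.json'
    csmParametersFile: '$(System.DefaultWorkingDirectory)/drop/deployment/deployAzure.parameters.json'
    deploymentMode: 'Incremental'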


Wrapping-up

Voila, you now have four different ways to deploy your Azure Function automatically. But don't take my word for it, try it yourself! If you need more details, you can visit the project on GitHub or watch this video where I demo the content of this post.


Reading Notes #374

Cloud


Programming


Podcasts

  • Hevesh5 - Making a YouTube Career from Viral Domino Art (#46) (That Creative Life) - Great show. An amazing story.
  • Azure Functions using Node with Simona Cotin (.NET Rocks!) - Great show. I just switched my website to that JAMstack pattern. I was planning to use Azure Functions to add a few little twists... I'm happy to see that I'm not alone thinking like that!
  • 0230 - Alain Vezina - Le métier du DevOps (Visual Studio Talk Show) - Great episode; very interesting to hear about the DevOps role from someone who lives it every day. Thanks for the suggestion; I believe I'm due to re-read The Phoenix Project.
  • Goal Setting Tips & Tracking KPIs (Video Pursuit Podcast) - Really interesting episode. Everybody talks about metrics and KPIs... but it's not frequent to hear about the "how". I really like how the goals are explained: achievable, but not easy... and how we should react when we don't reach them.

Miscellaneous


~ Good week!

Reading Notes #372

Suggestion of the week

Cloud

Programming


Miscellaneous

Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline

Static websites are lightning fast, and running them inside an Azure Blob storage instead of a WebApp is incredibly economical (less than $1/month). Does it mean you need to do everything manually? Absolutely not! In a previous post, I explained how to automatically generate your static website using a Build Pipeline inside Azure DevOps. In this post, let's complete the CI-CD by creating a Release Pipeline to deploy it.

The Azure Resource Manager (ARM) Template


First things first. If we want our release pipeline to deploy our website in Azure, we first need to be sure our resources are available "up there." The best way to do this is by using an Azure Resource Manager (ARM) template. I will use the same project started in the previous post; feel free to adapt it to your structure or copy from it.

Create a new file named deploy.json in the deployment folder. We need a simple storage account.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "StorageName": {
            "type":"string",
            "defaultValue": "cloudenfrancaisv2",
            "maxLength": 24
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "[parameters('StorageName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "[parameters('StorageName')]"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        }
    ],
    "outputs": {}
}

I used a parameter (StorageName) to define the name of the storage account. This way I could have multiple pipelines deploying to different storage accounts.
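
To reuse the template with a different storage account name, the parameter can be overridden at deployment time; for example, with the Azure CLI's inline parameter syntax (the resource group and account names here are placeholders):

# deploy with a different storage account name
az group deployment create -g MyResourceGroup --template-file deployment\deploy.json --parameters StorageName=mystaticweb2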

Now, to make the ARM template accessible to the release pipeline, we also need to publish it. The easiest way to do it is to add another CopyFiles task in our azure-pipeline. Add this task just before the PublishBuildArtifacts.

- task: CopyFiles@2
  displayName: 'Copy deployment content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/deployment'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/deployment
    cleanTargetFolder: true

Once you commit and push these changes, it will trigger a build. When done, the ARM template will be available, and we will be able to start working on the release pipeline.

The Release Pipeline


Navigate to the DevOps project created in the previous post. This time, create a new Release Pipeline. When asked, select an empty template; we will manually pick the tasks we need.

First, we need to define the trigger and where our artifacts are. Click on the Add an artifact box on the left of the screen. Select the build project, and let's use the latest version of the artifact for our deployment.

To get continuous deployment, you need to enable it by clicking on the lightning bolt and enabling the Continuous deployment trigger.

Now let's select our tasks. Click on the "+" sign to add new tasks. We need three of these: Azure Resource Group Deployment, Azure CLI, and Azure File Copy.



Task 1 - Azure Resource Group Deployment


The first one is an Azure Resource Group Deployment. It will be used to deploy our ARM template and be sure that the resources are available in Azure.

To configure the ARM deployment, we need to select the Azure subscription and authorize the pipeline to have access. Then you will need to specify the name of the resource group you will be deploying into, its location, and finally point to the linked ARM template.


Task 2 - Azure CLI


The second one is an Azure CLI task. As I am writing this post, it's not possible to enable the static website property of a storage account from an ARM template. Therefore, we will execute an Azure CLI command to change that configuration. Once you have picked the Azure subscription, select Inline script and enter this Azure CLI command:

az storage blob service-properties update --account-name wyamfrankdemo --static-website  --index-document index.html

This will enable the static website property of the storage account named wyamfrankdemo, and set the default document to index.html.

Task 3 - Azure File Copy


The last task is an Azure File Copy to copy all our files from $(System.DefaultWorkingDirectory)/drop/drop/output to the $web container (in our Azure Blob storage). The container must be named $web; that's the name used by Azure for the static website.
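
For reference, if you ever move this release to YAML, the equivalent task could look roughly like this (a sketch; the service connection name is a placeholder):

- task: AzureFileCopy@2
  displayName: 'Copy the site to the $web container'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/drop/drop/output'
    # 'MyAzureServiceConnection' is a placeholder for your own service connection
    azureSubscription: 'MyAzureServiceConnection'
    Destination: 'AzureBlob'
    storage: 'wyamfrankdemo'
    ContainerName: '$web'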

Wrapping up


Once you are done configuring the Release Pipeline, it's time to save and run it. After only a minute or two (this demo is pretty small), the blog should be available in Azure. To find your endpoint (aka URL), you can go to portal.azure.com and look at the static website property of the blob storage that we just created.
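
If you prefer the command line, the endpoint can also be queried with the Azure CLI (the resource group name here is a placeholder):

# show the static website endpoint
az storage account show --name wyamfrankdemo --resource-group MyResourceGroup --query "primaryEndpoints.web" --output tsv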

In a video, please!


I also have a video of this post if you prefer.





How to use Azure DevOps Pipeline and Cake to generate a static website

I have this little website, a blog that doesn't consume much bandwidth, and I was looking to optimize it. Since Azure Blob storage is such an inexpensive resource, I thought it would be the perfect fit. I could use a static website generator to transform my markdown files into a nice-looking blog and publish that to Azure! Using an Azure DevOps pipeline, I could, at every "git push", do all of that automatically without having anything installed on my machine... meaning I could write a new blog post from anywhere and still be able to update my blog.

In this post, I will explain all the steps required to create a continuous integration and continuous deployment process to deploy a static website into Azure.

The Goal


The idea here is to have, on a local machine, a folder tracked by Git. At every push, we want that change to trigger our CI-CD process. The Build Pipeline will generate the static website. The Release Pipeline will create our Azure resources and publish those artifacts.

The Static Website


In this post, I'm using Wyam.io as the static website generator. However, it doesn't matter; there are a ton of excellent generators available: Jekyll, Hugo, Hexo, etc. I selected Wyam because it is written in .NET, and if eventually I want to dig deeper, it will be easier for me.

All those generators follow the same pattern: you have an input folder containing all your posts and images, and an output folder that contains the generated result. You don't need to track the content of your output folder, so it's a good practice to modify the .gitignore file accordingly. As an example, here is how mine looks.

output/

tool/
tools/

wwwroot/

config.wyam.dll
config.wyam.hash
config.wyam.packages.xml

Build Pipeline


The build pipeline will generate our website for us. Therefore, it needs to have the generator installed. A great tool to do this kind of task is Cake. What I like about that tool is that it is cross-platform, so I can use it without worrying about which OS it will run on.

The Azure pipeline is defined in an azure-pipeline.yml file. Installing Cake should definitely be one of our first steps. To learn how to do that, navigate to the Get started page of the Cake website; it explains that we need to execute a build.ps1 or build.sh (depending on your build setup). That will install Cake and execute the file build.cake. Those files can be found on the GitHub repository, as mentioned on the website.
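
For reference, at the time of writing, the PowerShell bootstrapper could be downloaded like this (using the URL given on the Cake site):

# download the Cake PowerShell bootstrapper (Windows)
Invoke-WebRequest https://cakebuild.net/download/bootstrapper/windows -OutFile build.ps1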

On the Wyam website, in the deployment section of the documentation, you will find a sample for our required build.cake file. It looks like this:

#tool nuget:?package=Wyam&version=2.2.0
#addin nuget:?package=Cake.Wyam&version=2.1.3

var target = Argument("target", "Build");

Task("Build")
    .Does(() =>
    {
        Wyam( new WyamSettings {
            Recipe = "Blog",
            Theme = "CleanBlog"
        });        
    });

Task("Preview")
    .Does(() =>
    {
        Wyam(new WyamSettings
        {
            Recipe = "Blog",
            Theme = "CleanBlog",
            Preview = true,
            Watch = true
        });        
    });

RunTarget(target);

On the first two lines, it installs the required NuGet packages (you should definitely specify the versions). Then it defines some tasks and runs the generation command. Create that file at the root of the website folder.
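
From a local terminal, either task can then be run through the bootstrapper (assuming the standard Cake bootstrapper, which forwards a -Target parameter):

# generate the site once
./build.ps1

# or serve a local preview with file watching
./build.ps1 -Target Preview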

Now let's have a look at the azure-pipeline.yml file.

trigger:
- master

variables:
  DOTNET_SDK_VERSION: '2.1.401'

pool:
  vmImage: 'VS2017-Win2016'

steps:
- task: DotNetCoreInstaller@0
  displayName: 'Use .NET Core SDK $(DOTNET_SDK_VERSION)'
  inputs:
    version: '$(DOTNET_SDK_VERSION)'

- powershell: ./build.ps1
  displayName: 'Execute Cake PowerShell Bootstrapper'

- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    contents: '**\*'
    targetFolder: $(Build.ArtifactStagingDirectory)/output
    cleanTargetFolder: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'

The first line specifies the pipeline trigger. In our case, we will look at the master branch. Then I declare a variable to keep the .NET Core version; that way, it will be easier to maintain the script in the future.

The pool command specifies what kind of server is created. Here I'm using a Windows one, yet I could have used Linux too (all components are cross-platform).

Then comes the list of steps. The first one installs .NET Core. The second step is a PowerShell command to execute our build.ps1 file. At this stage, the static website should be generated in a subfolder output. The last two steps copy the content of the output folder into the ArtifactStagingDirectory and then publish it. This way, the Release Pipeline can access the artifacts.

There is detailed information about all the commands for a YAML Azure Pipeline file in the documentation. Create your own or copy-paste this one into a new azure-pipeline.yml file under a subfolder named deployment. Once your file is created, commit and push it to GitHub or any repository.

Navigate to Azure DevOps (dev.azure.com). Open your project, or create a new one. Now, from the left menu, click on Pipelines (the rocket icon) to create a new one. If you are using an external repository, like me, you will need to authorize Azure DevOps to access your repo.

To configure the pipeline, since we already have created the azure-pipeline.yml file, select the Existing Azure Pipeline YAML file option and point it to our file in the deployment folder.



It will open our YAML file. If you wish, you can update it. Run it by clicking the blue Run button in the top-right corner. Your build pipeline is done. Now, every time you push changes into your repository, the build will be triggered and generate the static website.

(Next post in the series - Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline)


In a video, please!

I also have a video of this post if you prefer.







Reading Notes #370

Cloud


Programming


Databases


Miscellaneous



Reading Notes #367

Suggestion of the week


Cloud


Programming


Miscellaneous


Reading Notes #366

Cloud


Programming


Databases


Miscellaneous



Reading Notes #364

Cloud




Programming


Miscellaneous


~

Reading Notes #363

Cloud


Programming


Databases


Miscellaneous


~


Reading Notes #358

Cloud


Programming


Miscellaneous


~

Reading Notes #354

Cloud


Programming


Books

  • Extreme Ownership: How U.S. Navy SEALs Lead and Win (Jocko Willink, Leif Babin) - Very interesting book. Yes, it contains a lot of battle details, and at first I was not sure, but then things "fall" all into place once you understand what the story was "demonstrating." It also contains more business-focused examples. Everything is very clear and well explained in plain English.


~

Reading Notes #352

Suggestion of the week


Cloud


Programming


Data


Miscellaneous


Books

  • The Personal MBA: Master the Art of Business (Josh Kaufman) - An interesting book that helps you get in the mood, get prepared, and, maybe for some who weren't sure yet about the idea of a personal MBA (compared to the regular one), start planning and get moving. This book isn't a one-book miracle MBA certification, but more likely a really good way to understand the journey the reader is about to start. The complete list of books for this adventure is constantly updated and available online.


~