Reading Notes #399

Cloud

  • Azure DevOps Roadmap update for 2019 Q4 (Gloridel Morales) - Since the multi-stage pipeline launch in May, the team has been listening to its community. In this post, learn more about what they have been working on and what their roadmap looks like.

Programming

  • Code Comments (Donn Felker) - Very smart idea! I'm starting to use that rule right away.
  • Microservices Fundamentals (Mark Heath) - A new Pluralsight course about a genuinely challenging topic. This post shares the outline of that Microservices course.
  • Stop Waiting! Start using Async and Await! (Simon Hawe) - Learn the power of async in this excellent post. The examples may be in Python, but the idea is the same whatever language we are using.

Miscellaneous


Books

Superfans: The Easy Way to Stand Out, Grow Your Tribe, And Build a Successful Business

Author: Pat Flynn

I really liked this book, and I'm planning to read it again soon. I like the way things are simply explained, as if a situation were deconstructed and then rebuilt. It felt authentic and true. It's nothing transcendent, but the way it is explained is great.




~


Let the adventure begin!


It's coming! I'm not talking about winter here, but about Microsoft Ignite, one of the biggest Microsoft events for the DevOps and Dev folks among us. This four-day event is a fantastic opportunity to get up to date with your favorite technology, learn best practices, get certified, and meet tons of experts to talk about your projects!



This year is very special for me because I have the great pleasure of being part of the adventure: I will be presenting two sessions that are part of the Learning Paths.

A Learning Path is a series of connected learning modules that includes sessions, hands-on experiences, technical workshops, certifications, and expert connections. Learning Paths work together to build upon what you’ve learned and provide a comprehensive set of skills to help you reach your goals.

Figuring out Azure Functions (AFUN95)
Tailwind Traders is curious about the concept behind “serverless” computing – the idea that they can run small pieces of code in the cloud, without having to worry about the underlying infrastructure. In this session, we cover the world of Azure Functions, starting with an explanation of the servers behind serverless, exploring the languages and integrations available, and ending with a demo of when to use Logic Apps and Microsoft Flow.

Options for building and running your app in the cloud (APPS10)
See how Tailwind Traders avoided a single point of failure using cloud services to deploy their company website to multiple regions. We cover all the options they considered, explain how and why they made their decisions, then dive into the components of their implementation. In this session, see how they used Microsoft technologies like Visual Studio Code, Azure Portal, and Azure CLI to build a secure application that runs and scales on Linux and Windows VMs and Azure Web Apps with a companion phone app.

But wait, there is more...


For the second time, after Microsoft Ignite "the event", another event starts: a tour! This year, Microsoft Ignite The Tour will visit thirty (30) cities! Every continent, except Antarctica (why?! 😉), will be visited. Check the complete list of cities on the website and mark the dates on your calendar!

I will also be presenting many different sessions during this tour. While I won't be doing all of them, I'll be there on many occasions.

Looking forward to meeting you


So if you are planning to go to one of these events and would like to meet to talk about your project, show some bugs, ask questions, or just chat, reach out! It's ALWAYS a pleasure, and I'll bring some stickers and some special swag1 with me...

Drop me a message on Twitter, LinkedIn, YouTube, Instagram, and Facebook.


1: Follow me on Twitter to know more about this ;)

Reading Notes #398


Cloud


Programming


Miscellaneous



Books


10-Minute Focus: 25 Habits for Mastering Your Concentration and Eliminating Distractions 

(Daniel Walter)

Nothing new here, but it's clear and very well explained. Honestly, it's good to revisit those productivity/focus habits... It helps us stay on our toes...

Reading Notes #397


Suggestion of the week

Cloud

Programming

Miscellaneous

~

Reading Notes #396

Suggestion of the week

Programming

Miscellaneous

~


Reading Notes #395


Cloud


Programming


  • New workflow editor for GitHub Actions (Chris Patterson) - Have you tried the new GitHub Actions? If so, you will be pleased with this new editor... ending the search for that missing space somewhere.

Miscellaneous

  • What You Need for Effective Remote Work (William Gant) - This is a full chapter of an upcoming book about remote work... If you are new to this adventure, and even more so if you are full-time remote, this read is a must.

Books




Author: Gretchen Rubin 

I really enjoyed this book. I found the categorization of all those habits and behaviour groupings very interesting. I also like the habit associations that help break some habits or create new ones. It's obvious, but I hadn't thought about it before.



Cleaning your mess in the cloud automatically



We all do it. We create resources in the cloud for a demo or a presentation and forget about them. Then at the end of the month, we receive a bigger invoice than expected and it's panic.

This is why I thought about AzSubscriptionCleaner. It's an open-source project that can be deployed in your subscription very easily. The goal is to have it deployable with one click directly from GitHub.

The tool can be deployed in two versions, using Azure Automation or Azure Functions. Based on a schedule, it executes a query to find all resources with a tag expireOn whose value is older than now(), and deletes them.
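
To give an idea of the clean-up logic, here is a minimal PowerShell sketch. This is not the actual project code, just the idea, and it assumes the Az PowerShell module is installed:

# Minimal sketch: find every resource with an expireOn tag whose value
# is in the past, then delete it.
$now = Get-Date

$expired = Get-AzResource -TagName "expireOn" |
    Where-Object { [datetime]$_.Tags["expireOn"] -lt $now }

foreach ($resource in $expired) {
    Write-Output "Deleting expired resource: $($resource.Name)"
    Remove-AzResource -ResourceId $resource.ResourceId -Force
}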

I wrote two blog posts, each paired with a YouTube video, that explain how the tools were built.

Azure Automation


Read the complete post on Dev.to: Keep your Azure Subscription Clean Automatically

Video:


Azure Functions


Read the complete post on Dev.to: Use Azure Function to Clean-up your Mess, Automatically

Video:


GitHub Repo


This is an open-source project: github.com/FBoucher/AzSubcriptionCleaner. You are welcome to look at the code, clone the repository, ask for more features, or submit a pull request to add a new one!

~

Reading Notes #394


Suggestion of the week


Cloud


Programming

  • Moving from jQuery to Vue (Shawn Wildermuth) - An interesting post that explains Vue and gives reference points for a jQuery user... like me.

Reading Notes #393


Suggestion of the week

  • GitHub stars won’t pay your rent (Kitze) - What a great story! This is the awesome journey of a developer who worked hard, took some risks and... got results. Every developer should read this.

Cloud

Programming

Miscellaneous

Reading Notes #392


The suggestion of the week


Cloud

  • Andrew Connell's Blog (Andrew Connell) - This nice post is the second in a series of three. It explains not only how to do every step, but also why the author decided to do it that way.

Programming


Miscellaneous


~

Reading Notes #391


Suggestion of the week

  • How to Use Github Professionally (Aaron Stannard) - This post is great! Tons of information and best practices (with an explanation of why it's a best practice).

Cloud


Programming


Books

Living with the Monks: What Turning Off My Phone Taught Me about Happiness, Gratitude, and Focus 

Author: Jesse Itzler

I really enjoyed this book. Yes, it's light and funny, but don't get fooled, there is a deeper message here. I think Jesse wins his challenge by going into a monastery so we don't have to. We all have what it takes to live a more purposeful life; we just need to pause. Slow down to go faster, do less to do more... Embrace the silence.



~

Reading Notes #390


Suggestion of the week

Cloud

Programming

Podcasts

  • Economics of Kubernetes, with Owen Rog (Craig and Adam) - Really interesting episode. Of course, all the news about Kubernetes was interesting, but even more so the economics of cloud computing with the guest of the week, Owen Rog.

Miscellaneous

Reading Notes #389

Cloud


Programming


Miscellaneous

Reading Notes #388


Suggestion of the week

Cloud

Programming

Podcast

Miscellaneous

  • How To Develop Apps Like PUBG (Apoorv Gehlot) - An interesting article that gives us an idea of how a game like PUBG achieved that success, and how they managed that rapid growth.
~

How to Deploy your Azure Function Automatically with ARM template (4 different ways)

It's so nice to be able to add some serverless components to our solutions to make them better in a snap. But how do we manage them? In this post, I will explain how to create an Azure Resource Manager (ARM) template to deploy any Azure Function, and show how I used this structure to deploy an open-source project I've been working on these days.

Part 1 - The ARM template

An ARM template is a JSON file that describes our architecture. To deploy an Azure Function we need at least three resources: a functionApp, a service plan, and a storage account.


The FunctionApp is, of course, our function. The service plan can be set as dynamic, or it can describe the type of resources that will be used by your function. The storage account is where our code is stored.


In the previous image, you can see how those components interact with each other. Inside the Function, we will have a list of properties. One of those properties will be the runtime; for example, in the AzUnzipEverything demo, it will be dotnet. Another property will be the connection string to our storage account, which is also part of our ARM template. Since that resource doesn't exist yet, we will need to build that connection string dynamically in the template.

The Function node will contain a sub-resource of type sourcecontrols. This is where we will specify where our code is located, so it can be cloned to Azure.

Building ARM for a Simple Function


Let's see a template for a simple Azure Function that doesn't require any dependencies, and then we will examine it.

You can use any text editor to edit your ARM template. However, the combination of VS Code with the Azure Resource Manager Tools and Azure Resource Manager Snippets extensions is particularly efficient.
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "storageFunc",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "storageFunc"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2018-02-01",
            "name": "servicePlan",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "Y1",
                "tier": "Dynamic"
            },
            "properties": {
                "name": "servicePlan",
                "computeMode": "Dynamic"
            },
            "tags": {
                "displayName": "servicePlan"
            }
        },
         {
              "apiVersion": "2015-08-01",
              "type": "Microsoft.Web/sites",
              "name": "functionApp",
              "location": "[resourceGroup().location]",
              "kind": "functionapp",
              "dependsOn": [
                "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
                "[resourceId('Microsoft.Storage/storageAccounts', 'storageFunc')]"
              ],
              "properties": {
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
                "siteConfig": {
                  "appSettings": [
                    {
                      "name": "AzureWebJobsDashboard",
                      "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                    },
                    {
                      "name": "AzureWebJobsStorage",
                      "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                    },
                    {
                      "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
                      "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
                    },
                    {
                      "name": "WEBSITE_CONTENTSHARE",
                      "value": "storageFunc"
                    },
                    {
                      "name": "FUNCTIONS_EXTENSION_VERSION",
                      "value": "~2"
                    },
                    {
                      "name": "FUNCTIONS_WORKER_RUNTIME",
                      "value": "dotnet"
                    }
                  ]
                }
              },
              "resources": [
                  {
                      "apiVersion": "2015-08-01",
                      "name": "web",
                      "type": "sourcecontrols",
                      "dependsOn": [
                        "[resourceId('Microsoft.Web/sites/', 'functionApp')]"
                      ],
                      "properties": {
                          "RepoUrl": "https://github.com/FBoucher/AzUnzipEverything.git",
                          "branch": "master",
                          "publishRunbook": true,
                          "IsManualIntegration": true
                      }
                 }
              ]
            }
        
    ],
    "outputs": {}
}

The Storage Account


The first resource listed in the template is the storage account. There is nothing specific about it.

The Service Plan


The service plan is the second resource in the list. It's important to note that to be able to use the Dynamic SKU, the apiVersion needs to be at least "2018-02-01". Then you specify the SKU.

    "sku": {
        "name": "Y1",
        "tier": "Dynamic"
    }

Of course, you can use another SKU if you prefer.

The Function App


The Function App is the final resource added to the mix, and this is where all the pieces come together. It's important to note that the order in which the resources are listed is not considered by Azure while deploying (it's only for us ;) ). To let Azure know the order, you need to add dependencies.

"dependsOn": [
    "[resourceId('Microsoft.Web/serverfarms', 'servicePlan')]",
    "[resourceId('Microsoft.Storage/storageAccounts', 'storageFunc')]"
]

This way, the Azure Function will be created after the service plan and the storage account are available. Then, in the properties, we will be able to build the connection string to the blob storage using a reference.

{
    "name": "AzureWebJobsDashboard",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', 'storageFunc', ';AccountKey=', listKeys('storageFunc','2015-05-01-preview').key1)]"
}

The last piece of the puzzle is the sub-resource sourcecontrol inside the FunctionApp. This will define where Azure should clone the code from and in which branch.

"resources": [
    {
        "apiVersion": "2015-08-01",
        "name": "web",
        "type": "sourcecontrols",
        "dependsOn": [
        "[resourceId('Microsoft.Web/sites/', 'functionApp')]"
        ],
        "properties": {
            "RepoUrl": "https://github.com/FBoucher/AzUnzipEverything.git",
            "branch": "master",
            "publishRunbook": true,
            "IsManualIntegration": true
        }
    }
]

To make sure that everything is fully automatic, the properties publishRunbook and IsManualIntegration must be set to true. Otherwise, you will need to do a synchronization between your Git repository (in this case on GitHub) and the Git in Azure.

There is excellent documentation that explains many different scenarios in Automate resource deployment for your function app in Azure Functions.

Azure Unzip Everything


To deploy the project AzUnzipEverything available on GitHub, I needed one more Azure Storage account with pre-defined containers (folders).


Of course, all the source code of both the Azure Function and the ARM template is available on GitHub, but let me highlight how the containers are defined in an ARM template.

"resources": [
    {
        "type": "blobServices/containers",
        "apiVersion": "2018-07-01",
        "name": "[concat('default/', 'input-files')]",
        "dependsOn": [
            "storageFiles"
        ],
        "properties": {
            "publicAccess": "Blob"
        }
    }
]

Just like with sourcecontrols, we need to add a list of sub-resources to our storage account. The name MUST start with 'default/'.

Part 2 - Four Deployment Options

Now that we have a template that describes our needs, we just need to deploy it. There are multiple ways it could be done; let's look at four of them.

Deploy from the Azure Portal


Navigate to the Azure Portal (https://portal.azure.com) from your favorite browser and search for "deploy a custom template" directly in the search bar located at the top of the screen (in the middle), or go to https://portal.azure.com/#create/Microsoft.Template. Once on the Custom deployment page, click on the link Build your own template in the editor. From there, you can copy-paste or upload your ARM template. You need to save it to see the real deployment form.


Deploy with a script


Whether it is in PowerShell or in Azure CLI, you can easily deploy your template with these commands.

In Azure CLI

# create resource group
az group create -n AzUnzipEverything -l eastus

# deploy it
az group deployment create -n cloud5mins -g AzUnzipEverything --template-file "deployment\deployAzure.json" --parameters "deployment\deployAzure.parameters.json"  

In PowerShell

# create resource group
New-AzResourceGroup -Name AzUnzipEverything -Location eastus

# deploy it
New-AzResourceGroupDeployment -ResourceGroupName  AzUnzipEverything -TemplateFile deployment\deployAzure.json

Deploy to Azure Button


One of the best ways to help people deploy your solution in their Azure subscription is the Deploy to Azure button.



You need to create an image link (in HTML or Markdown) pointing to a special destination built in two parts.

The first one is a link to the Azure Portal:

https://portal.azure.com/#create/Microsoft.Template/uri/

And the second one is the location of your ARM template:

https%3A%2F%2Fraw.githubusercontent.com%2FFBoucher%2FAzUnzipEverything%2Fmaster%2Fdeployment%2FdeployAzure.json

However, this URL needs to be encoded. There are plenty of encoders online, but you can also do it from the terminal with the following command (a big thanks to @BrettMiller_IT, who showed me this trick during one of my live streams).

[System.Web.HttpUtility]::UrlEncode("https://raw.githubusercontent.com/FBoucher/Not-a-Dog-Workshop/master/deployment/deployAzure.json")
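
Putting the two parts together, here is a small PowerShell sketch (building on the command above) that produces the final link and the Markdown for the button. The badge image URL below is the one commonly used in the Azure documentation; swap it for your own image if you prefer.

# Sketch: assemble the full "Deploy to Azure" link from the two parts described above.
# (In Windows PowerShell you may first need: Add-Type -AssemblyName System.Web)
$armTemplateUrl = "https://raw.githubusercontent.com/FBoucher/AzUnzipEverything/master/deployment/deployAzure.json"
$encodedUrl = [System.Web.HttpUtility]::UrlEncode($armTemplateUrl)

# Part 1 (portal prefix) + Part 2 (encoded template location)
$deployLink = "https://portal.azure.com/#create/Microsoft.Template/uri/$encodedUrl"

# Markdown for the button, ready to paste into a README
"[![Deploy to Azure](https://aka.ms/deploytoazurebutton)]($deployLink)"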

Clicking the button will bring the user to the same page on the Azure Portal, but in their own subscription.

Azure DevOps Pipeline

From the Azure DevOps portal (https://dev.azure.com), select your project and create a new Release Pipeline. Click on the + Add an artifact button to connect your Git repository.



Once it's added, you need to add a task to the current job. Click on the link 1 job, 0 task. Now you just need to specify your Azure subscription, the name of the resource group, and the location of your ARM template inside your repository. To make the deployment automatic with each push to the repository, click the little lightning bolt and enable the Continuous deployment trigger.


Wrapping-up

Voila, you now have four different ways to deploy your Azure Function automatically. But don't take my word for it, try it yourself! If you need more details, you can visit the project on GitHub or watch this video where I demo the content of this post.


Reading Notes #387

Cloud


Programming


Podcasts


Books



Dare to Lead: Brave Work. Tough Conversations. Whole Hearts. (Brené Brown) - A nice book, packed with a lot of information. Lots of stories to emphasize her points; I always like that. It was maybe a little too Cartesian for me... many steps. Or maybe I was not in the right mindset. A good book, however.








~

Reading Notes #386

Cloud


Programming


Databases


Miscellaneous


~


Reading Notes #385


Suggestion of the week

Programming


~


The Dog-Not-a-Dog Workshop


I recently presented a workshop at TOHack to help people get started with Azure. The goal was to try different Azure services and see how we could augment an existing website using serverless functions and artificial intelligence.
(Aussi disponible en français)


During this workshop, a website is deployed automatically from GitHub. Then, by adding an Azure Function and using the Vision API of Azure Cognitive Services, the final solution is able to detect whether uploaded pictures are dogs or not and keep our image folder "clean". We call that application the automatic Not a Dog application.
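
To give an idea of what the detection boils down to, here is a PowerShell sketch of a call to the Computer Vision analyze endpoint. The endpoint, key, and image URL below are placeholders, and the workshop's real code is on GitHub; this is only the concept.

# Sketch only: ask the Computer Vision API for tags and check whether "dog" is one of them.
$endpoint = "https://<your-region>.api.cognitive.microsoft.com"   # placeholder
$key      = "<your-cognitive-services-key>"                       # placeholder
$imageUrl = "https://example.com/uploaded-picture.jpg"            # placeholder

$response = Invoke-RestMethod -Method Post `
    -Uri "$endpoint/vision/v2.0/analyze?visualFeatures=Tags" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = $key } `
    -ContentType "application/json" `
    -Body (@{ url = $imageUrl } | ConvertTo-Json)

# Keep the picture only if one of the returned tags is "dog"
if (-not ($response.tags | Where-Object { $_.name -eq "dog" })) {
    Write-Output "Not a dog, this picture should be removed."
}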

The step-by-step instructions with the code can be found on GitHub: Not-a-Dog-Workshop. The workshop can be done in about 45-60 minutes.

I also did a video that is available on my YouTube channel:



If you have questions or are blocked, it will be a pleasure to help you.


Reading Notes #384

Programming

  • Install WSL 2 on Windows 10 (Thomas Maurer) - Awesome tutorial. If, like me, you didn't want to wait until the next Windows release or take the time to compile and debug a deployment... this tutorial is for us!

Databases


Miscellaneous

~

Reading Notes #383

Cloud


Programming


Miscellaneous


~

Be more Productive by using Inline Code in your Azure Logic App

In a project using Azure Logic Apps that I am working on, I needed to manipulate strings. I could create APIs or Azure Functions, but the code is very simple and doesn't use any external libraries. In this post, I will show you how to use the new Inline Code action to execute your code snippets directly inside your Logic Apps.

Quick Context


The Logic App will read a file from my OneDrive (it will also work with DropBox, Box, etc.). Here is an example of the file:

Nice tutorial that explains how to build, using postman, an efficient API.[cloud.azure.postman.tools]

The goal is to extract tags, contained between the square brackets, from the text.

Logic App: Get File Content


From the Azure Portal, create a new Logic App by clicking the big green "+" button in the top left corner and searching for Logic App.

For this demo, I will use the Interval as a trigger because I will execute the Logic App manually.

The first step will be a Get File Content action from the OneDrive connector. Once you have authorized Azure to access your OneDrive folder, select the file you want to read. For me, it's /dev/simpleNote.txt.

Integration Account


To access the workflowContext, the Azure Logic App requires an Integration Account. The next step is to create one. Save the current Logic App, and click the big "+" button in the top right corner. This time, search for integration. Select Integration Account, and complete the form to create it.


We now need to assign it to our Logic App. From the Logic App blade, in the options list select Workflow Settings. Then select your integration account, and don't forget to save!

Logic App: Inline Code


To add the action at the end of your workflow, click the New step button. Search for Inline Code, and select the action Execute JavaScript Code.


Before copy-pasting the code into the new Inline Code action, let's have a quick look at it.

// Grab the text returned by the "Get file content" action
var note = "" + workflowContext.actions.Get_file_content.outputs.body;
// Position right after the last "[" (0 if no bracket was found)
var posTag = note.lastIndexOf("[") + 1;
var cleanNote = {};

// If a "[" was found, split the note into its message and its tags
if (posTag > 0) {
    cleanNote.tags = note.substring(posTag, note.length - 1);
    cleanNote.msg = note.substring(0, posTag - 1);
}
return cleanNote;

On the first line, we assign the variable note the content of the Get_file_content outputs. We access it using the workflowContext. This context has access to the trigger and the actions. To find the name of an action, replace the spaces in its display name with the underscore character "_".


You can also switch to Code View and see the names of all the components in the JSON code.

Logic App: Use Inline Code Result


Of course, you can use the output of your Inline Code with other steps. You just need to use the Result from the dynamic content menu.


If for some reason the dynamic content list doesn't contain your Inline Code, you can always add the expression directly: @body('Cleaning_Note')?['body'].


Your Logic App should now look like this:


Verdict


The Inline Code action is very promising. Right now, it's limited to JavaScript and cannot access variables or loops. However, for simple code that doesn't require any references, it's easier to maintain and deploy. You can learn more about what exactly is covered or not here.
And it works as this result shows.


Do you prefer watching instead of reading?


I also have a video of this post if you prefer.



References


Reading Notes #382

Cloud


Programming

~