
Reading Notes #372

Suggestion of the week

Cloud

Programming


Miscellaneous

Reading Notes #369

Cloud


Programming


Miscellaneous



How to Unzip Automatically your Files with Azure Function v2

I published a video that explains how to unzip files without any code by using Logic Apps. However, that solution didn't work for bigger files or different archive types. This post explains how you can use an Azure Function to cover those situations. This first iteration supports "Zip" files; all the code is available in my GitHub.

Prerequisites


To create the Azure Function, I will use the excellent Azure Functions extension for Visual Studio Code. You don't "need" it; however, it makes things very easy.


You can easily install the extension from inside Visual Studio Code by clicking on the extension button in the left menu. You will also need to install the Azure Functions Core Tools.

Creating the Function


Once the extension is installed, you will find a new button in the left menu. It opens a new section with four new options: Create New Project, Create Function, Deploy to Function App, and Refresh.


Click on the first option, Create New Project. Select a local folder and a language; for this demo, I will use C#. This will create a few files and folders. Now let's create our Function. From the extension menu, select the second option, Create Function. Create a Blob Trigger named UnzipThis in the folder we just created, and select (or create) a Resource Group, Storage Account, and location in your subscription. After a few seconds, another question will pop up asking for the name of the container that our blob trigger monitors. For this demo, input-files is used.

Once the function is created, you will see this warning message.


What that means is that to be able to debug locally, we need to set AzureWebJobsStorage to UseDevelopmentStorage=true in the local.settings.json file. It will look like this.

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "unziptools_STORAGE": "DefaultEndpointsProtocol=https;AccountName=unziptools;AccountKey=XXXXXXXXX;EndpointSuffix=core.windows.net",
    }
}

Open the file UnzipThis.cs; this is our function. On the first line of the function, you can see that the Blob trigger is defined.

[BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]Stream myBlob

The binding is attached to the container named input-files, in the storage account reachable through the connection "cloud5mins_storage". The real connection string is stored in the local.settings.json file.
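
For reference, that connection setting lives in the Values section of local.settings.json, next to the entries shown earlier. A minimal sketch; the account name and key below are placeholders, not real values:

"cloud5mins_storage": "DefaultEndpointsProtocol=https;AccountName=<source-account-name>;AccountKey=<source-account-key>;EndpointSuffix=core.windows.net"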

Now, let's put the code we need for our demo:

[FunctionName("Unzipthis")]
public static async Task Run([BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]CloudBlockBlob myBlob, string name, ILogger log)
{
    log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name}");

    string destinationStorage = Environment.GetEnvironmentVariable("destinationStorage");
    string destinationContainer = Environment.GetEnvironmentVariable("destinationContainer");

    try{
        if(name.Split('.').Last().ToLower() == "zip"){

            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(destinationStorage);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference(destinationContainer);
            
            using(MemoryStream blobMemStream = new MemoryStream()){

                await myBlob.DownloadToStreamAsync(blobMemStream);

                using(ZipArchive archive = new ZipArchive(blobMemStream))
                {
                    foreach (ZipArchiveEntry entry in archive.Entries)
                    {
                        log.LogInformation($"Now processing {entry.FullName}");

                        // Replace any character that is not a letter, a digit, or "-" with "-"; Azure Storage has restrictions on valid characters
                        string validName = Regex.Replace(entry.Name, @"[^a-zA-Z0-9\-]", "-").ToLower();

                        CloudBlockBlob blockBlob = container.GetBlockBlobReference(validName);
                        using (var fileStream = entry.Open())
                        {
                            await blockBlob.UploadFromStreamAsync(fileStream);
                        }
                    }
                }
            }
        }
    }
    catch(Exception ex){
        log.LogInformation($"Error! Something went wrong: {ex.Message}");

    }            
}

UPDATED: Thanks to Stefano Tedeschi who found a bug and suggested a fix.

The source of our compressed file is defined in the trigger. To define the destination, destinationStorage and destinationContainer are used; their values are saved in local.settings.json. Then, because this function only supports .zip files, a little validation was required.
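
Those two settings also go in the Values section of local.settings.json. A minimal sketch, with placeholder values (the destination account name, key, and container name below are hypothetical):

"destinationStorage": "DefaultEndpointsProtocol=https;AccountName=<destination-account-name>;AccountKey=<destination-account-key>;EndpointSuffix=core.windows.net",
"destinationContainer": "<destination-container-name>"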

Next, we create an archive instance using the System.IO.Compression library. We then create references to the storage account, container, and blob. It is not possible to use a second (output) binding here, because a single archive file can contain a variable number of files to extract. Bindings are static; therefore, we need to use the regular storage API.

Then, for every file (aka entry) in the archive, the code uploads it to the destination storage.
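
For completeness, here are the namespaces the function above relies on; a sketch of the top of UnzipThis.cs, assuming the project references the storage bindings package (in Functions v2 this is typically Microsoft.Azure.WebJobs.Extensions.Storage, which brings in the WindowsAzure.Storage SDK used below):

using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;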

Deploying


To deploy the function, from the Azure Functions extension, click on the third option: Deploy to Function App. Then select your subscription and your Function App name.

Now we need to configure our settings in Azure. By default, the local settings are NOT used once deployed. Once again, the extension comes in handy.


Under the subscription, expand the freshly deployed Function App AzUnzipEverything, and right-click on Application Settings. Use Add New Setting to create cloud5mins_storage, destinationStorage, and destinationContainer.
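
If you prefer the command line, the same settings could also be created with the Azure CLI; a sketch, where the resource group, connection strings, and container name are placeholders (only AzUnzipEverything comes from the post):

az functionapp config appsettings set --name AzUnzipEverything --resource-group <your-resource-group> --settings "cloud5mins_storage=<source-connection-string>" "destinationStorage=<destination-connection-string>" "destinationContainer=<destination-container-name>"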

The function is now deployed and the settings are set. Now we only need to create the blob storage containers, and we will be able to test the function. You can easily do that directly from the Azure portal (portal.azure.com).
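
Alternatively, the containers can be created from the Azure CLI; a sketch, where the connection strings and the destination container name are placeholders:

az storage container create --name input-files --connection-string "<source-connection-string>"
az storage container create --name <destination-container-name> --connection-string "<destination-connection-string>"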

You are now ready to upload a file into the input-files container.
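
For a quick test, a zip file can be uploaded from the CLI as well (sample.zip is just a hypothetical file name):

az storage blob upload --container-name input-files --name sample.zip --file ./sample.zip --connection-string "<source-connection-string>"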

Let's Code Together


This first iteration only supports "Zip" files. All the code is available on GitHub. Feel free to use it. If you would like to see or add support for other archive types, join me on GitHub!

In a video, please!


I also have a video of this post if you prefer.



I also have an extended version where I introduce the Visual Studio Code extension for working with Azure Functions in more detail and explain more about Azure Functions v2.

Reading Notes #366

Cloud


Programming


Databases


Miscellaneous



Reading Notes #358

Cloud


Programming


Miscellaneous



Reading Notes #354

Cloud


Programming


Books

Extreme Ownership: How U.S. Navy SEALs Lead and Win (Jocko Willink, Leif Babin) - Very interesting book. Yes, it contains a lot of battle details, and at first I was not sure, but then things fall into place when you understand what each story is demonstrating. It also contains more business-focused examples. Everything is very clear and well explained in plain English.



Reading Notes #352

Suggestion of the week


Cloud


Programming


Data


Miscellaneous


Books

  • The Personal MBA: Master the Art of Business (Josh Kaufman) - An interesting book that helps to get in the mood, get prepared, and maybe, for those who weren't sure yet about the idea of a personal MBA (compared to a regular one), helps to start planning and get moving. This book isn't a one-book miracle MBA certification, but most likely a really good way to understand the journey the reader is about to start. The complete list of books for this adventure is constantly updated and available online.



How to create an Azure Container Instance (ACI) with .Net Core

For a project I just started, I need to create Azure resources from code. In fact, I want to create an Azure Container Instance. I already know how to create a container from Logic Apps and the Azure CLI/PowerShell, but I was looking to create it inside an Azure Function. After a quick search online, I found the Azure Management Libraries for .NET (aka the Fluent API), a project available on GitHub that does just that (and so much more)!
In this post, I will share how this library works and the result of my test.

The Goal


For this demo, I will create a .Net Core console application that creates an Azure Container Instance (ACI). Afterward, it should be easy to take this code and migrate it to an Azure Function or anywhere else.


The Console Application


Let's create a simple console application with the following commands:

dotnet new console -o AzFluentDemo
cd AzFluentDemo
dotnet add package microsoft.azure.management.fluent

The last command will grab the NuGet package available online and add it to our solution. Now we need a service principal so our application can access the Azure subscription. A simple way to create one is to use the Azure CLI:

az ad sp create-for-rbac --sdk-auth > my.azureauth

This will create an Azure Active Directory (AD) service principal (SP) and write its credentials into the file my.azureauth. Perfect. Now open the solution; for this kind of project, I like to use Visual Studio Code, so code . will do the work for me. Replace the content of the Program.cs file with the following code.
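
For reference, the file produced by --sdk-auth is a JSON document that looks roughly like this; the values below are placeholders, and a few endpoint fields are omitted:

{
  "clientId": "<service-principal-app-id>",
  "clientSecret": "<service-principal-password>",
  "subscriptionId": "<your-subscription-id>",
  "tenantId": "<your-tenant-id>",
  "activeDirectoryEndpointUrl": "https://login.microsoftonline.com",
  "resourceManagerEndpointUrl": "https://management.azure.com/"
}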

using System;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;
namespace AzFluentDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            string authFilePath = "/home/frank/Dev/AzFluentDemo/my.azureauth";
            string resourceGroupName  = "cloud5mins";
            string containerGroupName = "frank-containers";
            string containerImage  = "microsoft/aci-helloworld";
            // Set Context
            IAzure azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
            ISubscription sub;
            sub = azure.GetCurrentSubscription();
            Console.WriteLine($"Authenticated with subscription '{sub.DisplayName}' (ID: {sub.SubscriptionId})");
            // Create ResourceGroup
            azure.ResourceGroups.Define(resourceGroupName)
                .WithRegion(Region.USEast)
                .Create();
            // Create Container instance
            IResourceGroup resGroup = azure.ResourceGroups.GetByName(resourceGroupName);
            Region azureRegion = resGroup.Region;
            // Create the container group
            var containerGroup = azure.ContainerGroups.Define(containerGroupName)
                .WithRegion(azureRegion)
                .WithExistingResourceGroup(resourceGroupName)
                .WithLinux()
                .WithPublicImageRegistryOnly()
                .WithoutVolume()
                .DefineContainerInstance(containerGroupName + "-1")
                    .WithImage(containerImage)
                    .WithExternalTcpPort(80)
                    .WithCpuCoreCount(1.0)
                    .WithMemorySizeInGB(1)
                    .Attach()
                .WithDnsPrefix(containerGroupName)
                .Create();
            Console.WriteLine($"Soon Available at http://{containerGroup.Fqdn}");
        }
    }
}

In the first rows, I declare a few constants: the path to the service principal file created earlier, the resource group name, the container group name, and the image I will use; for this demo, aci-helloworld. Then we get access with Azure.Authenticate. Once we have access, it's very easy, and the IntelliSense is fantastic! I don't think I need to explain the rest of the code, as it is already self-explanatory.

Got an Error?


While running, you may encounter an error message complaining about the namespace not being registered, or something like that (I'm sorry, I did not note the exact error message). You only need to register it with the command:

az provider register --namespace Microsoft.ContainerInstance

It will take a few minutes. To see if it's done you can execute this command:

az provider show -n Microsoft.ContainerInstance --query "registrationState" 

Wrap it up


And voila! If you do a dotnet run, after a minute or two you will have a new web application running inside a container, available at http://frank-containers.eastus.azurecontainer.io. It's now very easy to take that code and bring it to an Azure Function or to any .Net Core application that runs anywhere (Linux, Windows, macOS, web, containers, etc.)!


In a video, please!


I also have a video of this post if you prefer.



References





Reading Notes #351


Cloud


Programming


Data




Reading Notes #344


Suggestion of the week


Cloud


Programming


Books

The Five Dysfunctions of a Team: A Leadership Fable (Patrick Lencioni) - I really enjoyed this book. The fact that the material is first presented as a story adds a lot of perspective and aids comprehension. In the last chapter, the author returns to the theory and gives more details. I completely devoured that book; I'm looking forward to reading more.


Miscellaneous


~Enjoy


Reading Notes #340


Cloud


Programming


Integration


Miscellaneous


Reading Notes #335

Suggestion of the week


Cloud


Programming


Data


Miscellaneous



Reading Notes #334


Suggestion of the week

  • HTTPS Is Easy! (Troy Hunt) - A wonderful series of 4 videos that explain how to get secure with HTTPS. A must!

Cloud


Programming


Miscellaneous


Reading Notes #332


Cloud


Programming


Books

  • Invisible Ink: A Practical Guide to Building Stories That Resonate (Brian McDonald) - We all know it: a story is the element that will give that little extra to our posts and videos. This short book explains how to really make an effective one by talking about the non-visual things...
    Really interesting.

    ISBN 0984178627 (ISBN13: 9780984178629)

Reading Notes #328

Cloud

  • 10 Reasons to Use Durable Functions (Mark Heath) - To celebrate his new course about Durable Functions, Mark shares with us his top 10 reasons why we should use them.

Programming


Data

  • Power BI Desktop May Feature Summary (Amanda Cofsky) - The monthly update is always great news. This month shows more about the new Q&A feature... You may not know about it, but you really want to use it...
  • Data Encodings and Layout (Clemens Vasters) - A very useful and deep article that provides best practices for data encoding in different types of situations.

Books

Exactly What to Say: The Magic Words for Influence and Impact
(Phil M. Jones)
I listened to this audiobook and really enjoyed it. A simple, powerful selection of key phrases that helps us get where we want to go. It was only about two hours long, and I listened to it in one shot. I'm almost certain I will listen to it again.
ISBN 9780692881958



Miscellaneous



Reading Notes #327

Cloud


Programming


Databases


Miscellaneous


Books

  • The Subtle Art of Not Giving a F*ck (Mark Manson) - Damn, it's good!
    The title of the book made me think it would be very negative. Not giving a fu#*... But it's really not. Quite the opposite, in fact. I really liked the book, and I'm planning to read/listen to it again in... one year. To see what changed.

Reading Notes #326

Cloud


Programming


Miscellaneous


Books



Start with Why: How Great Leaders Inspire Everyone to Take Action
Author: Simon Sinek
- Fantastic book. One to read and read again to really capture and validate our path, our why.

Reading Notes #325

Suggestion of the week



Cloud



Programming



Data



Miscellaneous


Books




Reading Notes #324

Cloud



Programming



Miscellaneous



Books


Eat That Frog!: 21 Great Ways to Stop Procrastinating and Get More Done in Less Time

Author: Brian Tracy

A short book that pushes you to action. I really enjoyed it. A book to read and read again.

ASIN: B01MYEM8SZ


Reading Notes #318 - MVP Summit 2018


Last week was the 25th edition of the MVP Summit, an event where Microsoft invites all of its MVPs to Seattle to spend some time with the product teams and learn about the latest news and best practices.

This year was particularly inspiring, thanks to the Microsoft roadmap, of course, but even more so to all the amazing people I got the chance to meet and talk with. I'm all pumped up, and I have tons of ideas and projects… more to come.

I already miss you…

Cloud


Programming