
What did you say you are, an MVP?


A few years ago, I decided to put myself in a situation where I’d be in contact with more people. Of course, I got a lot of technical questions, but also, and more often than you might think, people asked me about my MVP title. What’s an MVP? What do you need to do to become one? What do you do when you are an MVP? What do you get as an MVP? You can always come see me and ask; it would be a pleasure to speak with you. However, for those who want to stay in the shadows, or are not sure how to ask, I decided to write a blog post about it.


What is an MVP?

MVP_Logo_Preferred_Cyan300_RGB_300ppi
In short, MVPs are Microsoft Most Valuable Professionals. For more than twenty years, Microsoft has given that award to technology experts who passionately share their knowledge with the community. There are close to 4,000 MVPs around the globe, specializing in approximately seventeen different categories. MVPs share their passion by writing blog posts, articles, and books, answering questions in forums, and giving presentations at community groups and events.

How I Became an MVP

Firstly, I didn’t do anything to become an MVP. I became an MVP because of what I was doing. And that’s how I think it should be approached.

I’ve always read a lot about new technologies and enjoy working on home “pet projects”. I find that the optimal way to learn is by doing, whenever possible. Having real struggles with “features” is the best brain push-up you can get. Because things change quickly, and because I was playing with a large variety of technologies, I kept notes on my work. First, my notes were on a disk; then I put them online. From time to time, when someone would ask me a question at work, I would search through my notes. Although my online notes were public, the URL was purposely not a friendly one. Before long, co-workers started to ask me if they could have the URL to my notes, and I knew it was time to take the first real step into the light.

I still remember clearly the talk I had with Dom, an "Eagle" (tech prime) at the office. I was shy, but he told me to give it a try; some people already liked what I was sharing, and after all, I was doing the work for myself anyway, so why not share it with others? Since getting out of your comfort zone is a good way to learn, I decided to start my blog. And just to add a level of difficulty, I decided to do it in English, since at the time I only knew French.

I started my blog, FrankysNotes.com, and changed all my funny avatars to my photo. Since starting my blog, I have published my reading notes every Monday. And exactly as the saying goes, “build it, and they will come.” The weeks passed, and the number of weekly views on my blog grew from dozens to hundreds. Encouraged, I decided to publish the notes from one of my home projects. Immediately the number of views jumped. I was trying to write more, but it was a very long process since I was definitely not fluent in English. However, every time I published a post, I was really proud. Not only had I successfully finished a project and learned something new, I had also written a post that would help others learn too. At the same time, I joined the Canadian Windows Azure Community Experts group and participated in events.

Sharing my passion through presentations, blog posts, and meetups was enough for me to be welcomed into the big MVP family.

CanadianMVPFamilly

What I’ve Been Up to Since Becoming an MVP

Having been recognized as an MVP didn’t change me. I’m still very nervous before a presentation or publishing a blog post. I continue to code late at night on personal projects. More than ever, I’m excited when after a long battle with a bug, I finally find the way to make everything work.
What is different is that I now have more opportunities. Friends reach out to me (and other times it’s me reaching out to them) to ask a quick question or to invite me to join them at an event. It’s up to me to embrace those challenges, get out of my comfort zone, and continue to share my passion and my journey through the technology I like so much. I’ve started a second blog, in French this time, called CloudenFrancais.com. I’ve also joined the MSDEVMTL (Cloud) community group in Montreal, Quebec, Canada as a co-admin, and started a new community in Ottawa, Ontario, Canada called OttCloudTech.

Thank you for reading my blog posts and asking questions. Next time you have the chance, reach out to me, it would be terrific to finally meet you.
See you soon.


How I use Azure Logic App, API App and Function App in my life

Why should we do things over and over? I know I don't want to waste my time doing so. Every time I catch myself doing something I've already done, I look for an optimization. Sometimes it's just a little thing, like trying a different path to get from the train station to the office; other times, it's like drinking the rainbow juice directly from the source.

In this post, I will share with you how I optimized a three-hour process into a two... minute, automated, customizable solution using Azure Logic App and API App.

Quick Context

For the past two hundred fifty-something weeks, every Monday I have shared my reading notes. Those are articles that I read during the previous week, with a little comment: my 2 cents about each. I read all those books, articles, and posts on my e-book reader, in this case a Kindle Paperwhite. I use an online service called Readability that can easily clean an article (remove everything except the text and useful images) and send it to my e-book reader. All the articles are kept as a bookmark list.

SendToKindle

When I have time, I read what's on my list. Each time I'm done, I add a note ending with tags between square brackets; those will be used later for searching and filtering.
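For example, a finished note might look like this (my own wording, for illustration): Very good walkthrough of the new feature. [azure, functions]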

OnlyNotewithTags

Every Sunday, I extract the notes, then search in Readability to find the URLs again, and build my post for the next morning from all of this. A few years ago, I did a first optimization using a Ruby script, and I blogged about it here. While it's been working very well all those years, I wanted to change it so I don't have to install Ruby and the gems on every machine I use.

Optimization Goals

Here is the list of things I wanted for this new brew of the Reading Notes post generator:
  • No install required
  • Not coupled to any device or service
  • Generates a JSON version for future purposes
  • Cheap

Optimization Plan

Azure Logic App is the perfect candidate for this optimization because it doesn't require any local installation. Since it's a flow between connectors, any of them can be changed to please the user. In fact, that was the primary factor that made me pick Azure Logic App. For example, in the current solution, I'm using OneDrive as an initial drop zone. If in your environment you would prefer using Dropbox instead of OneDrive, you just need to switch connectors, and nothing else will be affected. Another great advantage is that Azure Logic App is part of the App Service ecosystem, so all those components are compatible.

Here is the full process plan, at a high level.

ReadingNoteLogicApp_blog

  1. Drop the My Clippings.txt file in a OneDrive folder.
  2. Make a copy of the file in Azure Blob storage, using the Blob storage built-in connector.
  3. Parse the My Clippings.txt file to extract all the clippings (notes) since the last extraction, using my custom MyClippings API App.
  4. For each note:
    1. Get the URL where the post comes from, using my custom Readability API App.
    2. Extract the tags and category (the first tag is used as the category), using my custom Azure Function.
    3. Serialize the note with all its information in JSON (see the sketch just below).
    4. Save the JSON file to a temporary container in Azure Blob storage.
  5. Generate a summary of all the JSON files using my last custom ReadingNotes API App. It also saves JSON and Markdown files of the summary.
Note: The summary could be published directly, but I decided to keep that last step manual, so I can check for typos and grammar.
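To give an idea of what steps 4.3 and 4.4 produce, here is a minimal sketch of a note object and its serialization. The class and property names are my illustration; they are not necessarily the exact ones used in the real solution.

public class ReadingNote
{
    public string Title { get; set; }     // Title of the article, as found in My Clippings.txt
    public string Url { get; set; }       // URL found back through the Readability API App
    public string Note { get; set; }      // My comment, with the tags removed
    public string Category { get; set; }  // The first tag
    public string[] Tags { get; set; }    // The remaining tags
}

// Serialized with Json.NET, then saved in the temporary blob container:
var note = new ReadingNote { Title = "Sample article", Category = "azure", Tags = new[] { "azure" } };
string json = Newtonsoft.Json.JsonConvert.SerializeObject(note);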

Let's look at the details

Many great tutorials are already available about how to create an Azure Logic App or an Azure API App, so in this post I won't explain how to create them; instead, I will mostly share with you some interesting snippets and gotchas I encountered during my journey.

In the next screen capture, you can see all the steps contained in my two Logic Apps. I decided to split the process because many tasks needed to be done for every note. The main loop (on the left) fetches the notes collection and generates the output. For every note, it calls the child Logic App (on the right).


logicapp-Overview

The ReadingNotes Builder contains:
  1. The initial trigger, when a file is created in a OneDrive folder.
  2. Create a copy of the file in Azure Blob storage.
  3. Delete the file in OneDrive.
  4. Get the configuration file from Azure Blob storage.
  5. Call the API App responsible for parsing the file and extracting the recent clippings.
  6. Call the child Logic App for every clipping returned in the previous step (foreach loop).
    • A. Trigger of the child Logic App.
    • B. Call the API App responsible for searching the online bookmark collection and returning the URL of the article/post.
    • C. Call the Azure Function responsible for extracting tags from the note.
    • D. Call the Azure Function responsible for converting the object to JSON.
    • E. Save the JSON object in a file in Azure Blob storage.
    • F. Get the updated configuration file.
    • G. Call the Azure Function responsible for keeping the latest note date.
    • H. Update the configuration with the latest date.
    • I. Return an HTTP 200 code, so the parent Logic App knows the work is done.
  7. Call the API App responsible for building the final markdown file.
  8. Save the markdown file returned at the previous step to Azure Blob storage.
  9. Update the final configuration.

As the initial trigger, I decided to use OneDrive because it's available from any device, anywhere. I just need to drop the MyClippings.txt file in the folder, and the Logic App will start. Here is how I configured it in the editor:


logicappMain-InitialTrigger

MyClippings API App


Kindle's notes and highlights are saved in a local file named My Clippings.txt. To be able to manipulate my notes, I needed a parser. Ed Ryan created the excellent [KindleClippings][KindleClippings], an open-source project available on GitHub. In the actual project, I only wrapped that .NET parser in an Azure API App. Wrapping the parser in an API App will help me later to manage security and metrics, and it simplifies the connection to all my other Azure solutions.

I won't go into the details of how to create an Azure API App; a lot of great documentation is available online. However, you will need at least the Azure .NET SDK 2.8.1 to create an API App project.

CreateAPIApp_step1


Swagger API metadata and UI are a must, so don't forget to un-comment that section in the SwaggerConfig file.


CreateAPIApp_step2


I needed an array of the notes taken after a specific date. All the heavy work is done in the [KindleClippings][KindleClippings] library. The ArrayKindleClippingsAfter method gets the content of the My Clippings.txt file (previously saved in Azure Blob storage) and passes it to the KindleClippings.Parse method. Then, using LINQ, it returns only the notes taken after the last ReadingNotes publication date. All the code for this MyClippings API App is available on GitHub, but here is the method:


public Clipping[] ArrayKindleClippingsAfter(string containername, string filename, string StartDate)
{
    // Read the My Clippings.txt file previously saved in Azure Blob storage
    var blobStream = StorageHelper.GetStreamFromStorage(containername, filename);

    // Let the KindleClippings library do the heavy lifting
    var clippings = KindleClippings.MyClippingsParser.Parse(blobStream);

    // Keep only the notes added after the last publication date
    var result = (from c in clippings
                    where c.DateAdded >= DateTime.Parse(StartDate)
                    && c.ClippingType == ClippingTypeEnum.Note
                    select c as KindleClippings.Clipping).ToArray<Clipping>();
    return result;
}


Once the Azure API App is deployed in Azure, it can easily be called from the Azure Logic App.


AddCustomAPIApp


A great thing to notice here is that because Azure Logic App is aware of the required parameters and remembers all the information from the previous steps (shown in different colors), it is very simple to configure the call.


logicappMain-GetTheClippings

Azure Logic App calling another Azure Logic App


Now that we have an array of notes, we will be able to loop through each of them to execute the other steps. When I wrote the app, the only way to execute multiple steps in a loop was to call another Azure Logic App. The child Logic App is triggered by an HTTP POST and returns an HTTP 200 code when it's done. A JSON schema is required to define the input parameters. A great tool to get that done easily is http://jsonschema.net.
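Just to illustrate the contract, here is roughly what such a call would look like from C# code. In the real solution the parent Logic App makes this POST itself; the callback URL (with its signature) and the payload shape below are placeholders.

using System;
using System.Net.Http;
using System.Text;

class ChildLogicAppClient
{
    static void Main()
    {
        // Hypothetical callback URL; the real one comes from the child Logic App's trigger blade.
        const string childUrl = "https://prod-00.eastus.logic.azure.com/workflows/.../triggers/manual/run?...";

        // The body must match the JSON schema defined for the trigger.
        var payload = "{\"note\":\"Nice overview of Logic Apps. [azure, logicapp]\"}";

        using (var client = new HttpClient())
        {
            var response = client.PostAsync(childUrl,
                new StringContent(payload, Encoding.UTF8, "application/json")).Result;

            // The child Logic App returns HTTP 200 once its work is done.
            Console.WriteLine(response.StatusCode);
        }
    }
}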


logicappSub-InitialTrigger

Readability API App

Readability is an online bookmark service that I have been using for many years now. It offers many APIs to parse or search articles and bookmarks. I found a great .NET wrapper on GitHub called CSharp.Readability, written by Scott Smith. I could have called Readability directly from the Logic App; however, I needed a little more, so I decided to use Scott's version as the base for this API App.
From the clipping collection returned at the previous step, I only have the title and need to retrieve the URL. To do this, I added a recursive method called SearchArticle.

private BookmarkDetails SearchArticle(string Title, DateTime PublishDate, int Pass)
{
    // The date window around the publication date grows with every pass
    var retryFactor = 2 * Pass;
    var fromDate = PublishDate.AddDays(-1 * retryFactor);
    var toDate = PublishDate.AddDays(retryFactor);
    var bookmarks = RealAPI.BookmarkOperations.GetAllBookmarksAsync(1, 50, "-date_added", "", fromDate, toDate).Result;
    var result = from b in bookmarks.Bookmarks
                    where b.Article.Title == Title
                    select b as BookmarkDetails;
    if (result.Count() > 0)
    {
        return result.First<BookmarkDetails>();
    }
    // No match found: retry with a wider window
    if (Pass <= 3)
    {
        return SearchArticle(Title, PublishDate, Pass + 1);
    }
    return null;
}
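Since a bookmark's added date rarely matches the note's publication date exactly, the method widens the date window on every pass and retries until Pass exceeds 3, then gives up and returns null.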

Azure Function: Extract tags

While most of the work is done in the different API Apps, I needed a few little tools. There were many possibilities, but I decided to take advantage of the new Azure Function App; they just sit there waiting to be used! The ReadingNotes Builder uses three Azure Functions; let me share one of them: ExtractTags. An interesting part of functions is that you can configure them to be triggered by some event or to act as a webhook.

CreateFunctionApp

To create a function as a webhook, you can use one of the templates when you create the new function; or, from the code editor in the Azure Portal, you can click on the Integrate tab and configure it.


SetWebHookonFunctionApp


Once that's done, you are ready to write the code. Once again, this code is very simple. It starts by validating the input: I'm expecting JSON with a node called note. It then extracts the tags from the note and returns both parts.


public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    string tags;
    string cleanNote;
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);

    // Validate the input: a "note" node is required
    if (data.note == null ) {
        return req.CreateResponse(HttpStatusCode.BadRequest, new {
            error = "Please pass note properties in the input object"
        });
    }

    string note = $"{data.note}";

    // The tags sit between square brackets at the very end of the note
    int posTag = note.LastIndexOf('[')+1;
    if(posTag > 0){
        tags = note.Substring(posTag, note.Length - posTag-1);
        cleanNote = note.Substring(0,posTag-1);
    }
    else{
        tags = "";
        cleanNote = note;
    }

    // Return both parts: the tags and the note without them
    return req.CreateResponse(HttpStatusCode.OK, new {
        tags = $"{tags}",
        cleanNote = $"{cleanNote}"
    });
}
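For example, given the input {"note":"Great summary of the new features [azure, functions]"}, the function returns tags = "azure, functions" and cleanNote = "Great summary of the new features".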


Now, to call it from the Azure Logic App, you first need to select Show Azure Functions in the same region to see your Function App. Then simply add the input; in this case, as expected, a JSON with a node called note.


logicappSub-FunctionCall

What's next

Voila! This is a simple but real application. While I shared only part of the code in this post, all of it is available on GitHub. I also did a presentation at DevTeach, and more details are present in those slides. Using Azure Logic App to build this application was really interesting, and easy. Now that some pieces are in place, I will be able to grow my environment by adding more functionality, an interface, more security, but that's for another post...



Everything we Should Know About the new Azure Usage And Billing (AUBI) Portal

(This post is also available in French.)

If one image is worth a thousand words, then the amount of information you get in Azure Usage And Billing (AUBI) is incredible. This portal is an open-source project that was announced a few weeks ago. In this post, I will share my first impressions of it.

Portal

The project is still young, but very alive. When I installed it, I had one or two minor issues, but by the time I wrote this post, all of them were already fixed.

Where is it?

The Azure Usage And Billing site is not a website like portal.azure.com; it is a solution you need to deploy in an Azure subscription. It doesn't have to be the subscription you wish to monitor, just one you have access to. The solution contains two websites (both with Application Insights, and one also with webjobs), a SQL database, and a storage account; you will also need to deploy a Power BI report.

resourcesgroup

All of it can easily be deployed using the included PowerShell script and Azure Resource Manager (ARM) template. Only a few manual steps are required. Fortunately, very clear and complete documentation is available, in both video and written form, on the GitHub project page.

What can I do with it?

Once everything is fully deployed, you will need to navigate to your instance of the registration portal (ex: http://frankregistrationv12.azurewebsites.net) and register all the subscriptions you want. After the webjobs have finished fetching all the data, it will all be available in the Power BI reports.
Power BI does an incredible job of showing all the information about your subscription(s). A very useful point here is that all the information present in the dashboard is interactive! Whether you select one or many subscriptions, or only a specific category of Azure service, all the other tiles adjust automatically.

Aubi_800

What's Next?

If it's not already done, I highly recommend installing the AUBI portal and starting to enjoy all that detailed information available to you without any effort, presented in such a beautiful way. For all the details about the prerequisites and the install procedure, go to the GitHub project page.






Create and Deploy .NET Core WebApp to Azure from Linux

(This post is also available in French.)

The other day, I was glued to my PC, and I had spare time (yeah, I know, very unusual). Since .NET Core 1.0 had been released just a few days before, I decided to give it a try. To add an extra layer of fun to the mix, I decided to do it from my Ubuntu VM. In less than 15 minutes, everything was done! I was so impressed I knew I needed to talk about it. That's exactly what this post is about.

The preparation

Before we get started, it's important to know which version of Ubuntu you are using, because some commands will be slightly different. To find the version you are running, simply click the gear in the top right of the desktop screen and select About this Computer. In my case, since I'm using Ubuntu 14.04 LTS, I will be using the commands related to that version. If you are using a different version, please refer to the .NET Core documentation.

ubuntu_version

Now we need to install .NET Core. Nothing could be easier. Open a Terminal (Ctrl+Alt+T) and type these three commands:

# Set up the apt-get feed, adding dotnet as a repo
sudo sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/dotnet/ trusty main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893

# Get the latest packages
sudo apt-get update

# Install the .NET Core SDK
sudo apt-get install dotnet-dev-1.0.0-preview2-003121
Once it's all done, you can type dotnet --info and you should see something like this.

dotnet_info


Create the Local WebApp

From the same Terminal we will create an empty folder for our solution and move into it. Execute these commands.
mkdir demodotnetweb
cd demodotnetweb
We now want to create our new web project. This is done with the command dotnet new, but we need to specify the type; otherwise, it will create a console application.
dotnet new -t web
Now, to download all the required references (NuGet packages) of our project, execute the following command:
dotnet restore
Based on the speed of your Internet connection and how many dependencies are required, this can take from a few seconds to more than one minute.
To test if our solution works locally type the command:
dotnet run
That will compile our solution and start the ASP.NET Core internal hosting. Launch a web browser and go to http://localhost:5000 to see the app in action.

dotnetcore_localhost

Deploy to Azure

To deploy our new project to the cloud, we will use the continuous deployment capability of Azure. First, navigate to portal.azure.com and create a Web App.

create_webApp

Once the application is created, you should see the This web app has been created message when you navigate to [nameofWebApp].azurewebsites.net

successfully_created

It's now time to add a local Git repository. In the Web App Settings, select Deployment source. Click on Configure Required Settings, then select the Local Git Repository option.

add_source_control

After setting the user credentials for the repository, we can get the URL from the Essentials section.

repourl

Back in our Ubuntu Terminal, we will execute these commands:

# create a git repository
git init
# stage and commit all files
git add -A
git commit -m "Init"

# Add the remote repository
git remote add azure https://username@demowebcore.scm.azurewebsites.net:443/demowebcore.git

# Push the code to the remote
git push azure master
After a minute or so, you should have your WebApp online!

dotnetcore_azure


Voila! That was fun!

Automating Docker Deployment with Azure Resource Manager

Recently, I had to build a solution where Docker containers were appropriate. The idea behind containers is that once they are built, you just have to run them. While that's true, my journey was not exactly that; nothing dramatic, only a few gotchas that I will explain while sharing my journey.

The Goal

The solution is classic, and this post will focus on a single virtual machine (VM). The Linux VM needs a container that automatically runs when the VM starts. Some files are first downloaded from a secure location (Azure Blob storage), then passed to the container. The solution is deployed using Azure Resource Manager (ARM). For simplicity, I will use an Nginx container, but the same principle applies to any container. Here is the diagram of the solution.

Docker-in-Azure2

The Solution

I decided to use the Azure CLI to deploy, since this will also be the command line used inside the Linux VM, but Azure PowerShell could do the same. We will be deploying an ARM template containing a Linux VM and two VM extensions: DockerExtension and CustomScriptForLinux. Once the VM is provisioned, a bash script will be downloaded by the CustomScriptForLinux extension from the secure Azure Blob storage myprojectsafe, then executed.

What's your plan B when the Azure Portal is not responding

(This post is also available in French.)

You are about to test the latest version of your solution. You just need to change some configuration in the Microsoft Azure Portal, and you are good to go. To do it, you log in to the portal at http://portal.azure.com, navigate to your component and... Error! What you see is a little sad cloud.

OneSadCloud

The Problem

This just happened to the team I'm working with. They needed to change the Traffic Manager's endpoint to do a traffic test. Unfortunately, the grid that contains the endpoints was in a bad state and was not available. It looked like a dead end!

ManySadClouds

But is it really? Of course not. Here is what you can do.

The Solution

Remember, Microsoft exposes the same API the Azure Portal is using. That's the beauty of the Azure Portal: you can use it as a convivial way to do what you need, or you can access the same API via the many different SDKs available today: .NET, Java, Node.js, PHP, Python, Ruby, and more! You also have command-line tools that can help you manage your Azure services and apps using scripts.
To learn more about all the available SDKs and command-line tools, refer to the Azure SDKs documentation pages online.

Remember

This time we were in a Windows environment, and we needed to modify one endpoint of a Traffic Manager. Here is what we did using the Azure PowerShell cmdlets:

# Login to our account
Login-AzureRmAccount

# Set the context we will work in. Use Get-AzureRmSubscription to list all your subscriptions. 
Set-AzureRmContext -SubscriptionName "MySubscriptionName"

# List all Traffic Manager profiles
Get-AzureRmTrafficManagerProfile

# Load our endpoint in a variable, change the value we need and put it back.
$endpoint = Get-AzureRmTrafficManagerEndpoint -Name myendpoint -ProfileName myprofile -ResourceGroupName "MyResourceGroupName" -Type ExternalEndpoints
$endpoint.Weight = 50
Set-AzureRmTrafficManagerEndpoint -TrafficManagerEndpoint $endpoint
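In this example, Weight is the value (between 1 and 1000) that the weighted routing method uses to decide how much traffic each endpoint receives. The same get, modify, set pattern works for any other endpoint property.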


In this case, we used the Azure Resource Manager (ARM) commands, but all the commands are also available in the classic service mode. To learn more about how to deploy with ARM, you can read my previous post. To see all the ARM commands available to configure your solution, see the documentation online.
Happy testing!




Azure Resource Manager (ARM) for beginners

(This post is also available in French: Azure Resource Manager (ARM) pour débutants)

You know that image where you see people pulling a cart with square wheels while a man on the side wants to show them a round wheel, but the group replies they are too busy to care... Well, that was me with Azure Resource Manager (ARM). I knew it was good, but it looked too complicated, so I was waiting. Last weekend, I decided enough was enough: I needed to learn it! After just a few minutes, you cannot imagine my disappointment at having waited so long. It's so simple, so powerful, and also so fast. This post explains how to deploy an ARM template, and how it works.

squarewheels

5 easy steps to deploy our first ARM template


To get started the easiest way possible, I decided to use Visual Studio. For this sample, let's create a simple Windows virtual machine (VM). Only five steps are required:

Step #1 - Create an Azure Resource Project

From Visual Studio, create a new project of type Azure Resource Group. Be sure you have already installed the latest version of the Azure SDK and the Visual Studio updates.

step-1-Create_arm_project.1

Step #2 - Select the ARM template

This is where we select what we want in our template. Many options are available in Visual Studio, and a lot more can be found on GitHub at Azure Arm Git Template. In our case, let's select the sample Windows Virtual Machine and click the OK button.

step-2-Select_Tempplate

Step #3 - Deploy the new template

Visual Studio will now generate multiple files; we will come back to them later. Right now, we will only deploy our solution. Right-click on the project and select Deploy.

step-3-Start_deploy

Step #4 - Configure the deployment

Our first deployment is mostly ready; we just need to specify a few details like the subscription and the resource group. Once you click Deploy, one last thing will be asked: the adminPassword.

step-4-Config_deploy

Step #5 (the easiest one) - Enjoy

Voila! After a few minutes, the virtual machine will be created, and we should be able to connect to it remotely.

Let's explain the magic


When the project was created, three folders were populated: Scripts, Templates, and Tools. The last one is a bit obvious: it contains AzCopy, a tool to copy files. If you don't know AzCopy, you can learn more in a post I wrote recently.

Open the Deploy-AzureResourceGroup.ps1 contained in the Scripts folder. It's this script that does all the heavy lifting to deploy our resource group. If we look a bit closer, you will notice some parameters are declared at the beginning of the file. Two of them should catch our attention.
    [string] $TemplateFile = '..\Templates\WindowsVirtualMachine.json',
    [string] $TemplateParametersFile = '..\Templates\WindowsVirtualMachine.parameters.json',

TemplateFile is the path of our template (the one we selected previously), and TemplateParametersFile contains all the parameter values that fill in the blanks of our template. This is especially useful for deploying the same template in different environments. In fact, this is a really big advantage: you can deploy the exact same schema to your development and production environments just by having two parameters.json files.

Let's have a peek at the template, in this case WindowsVirtualMachine.json. It's a JSON file, so it's human-friendly, but it can be a bit scary at first. In the image just below, I collapsed the collections to emphasize the three prime elements: parameters, variables, and resources.

jsonTemplate

We already know parameters, so let's jump to the variables. This section contains a list of key-value pairs like imagePublisher, vmSize, virtualNetworkName, diagnosticsStorageAccountName, etc. Those can be fixed values, or dynamic ones built from other variables or parameters. Here are some examples:
    "vmSize": "Standard_A2"
    "vhdStorageName": "[concat('vhdstorage', uniqueString(resourceGroup().id))]"
    "virtualNetworkName": "[parameters('virtualNetworkName')]"

Last section but not least: the resources. This is where everything is put together to build the solution you will deploy. The resources are defined by specifying their type, name, and properties. You can assign any value: a static string, a parameter value, or a variable value.

Now that we know it works, why should we use it?


Explaining all the advantages of using ARM templates could be a post by itself, and goes beyond the scope of this one. However, here are a few reasons:
  • A template file is light and easy to keep in a repository.
  • It's very simple to have the exact same template deployed in multiple environments.
  • ARM templates are really fast to deploy.
  • Easy to edit, customize, and expand.
  • Easy to delete.

In Video Please


If you prefer, I also have a video version of this post.



What's Next


This post was voluntarily very simple, to be easy to understand. Now it's your turn: deploy a modest schema or something more complex. Many different templates can be found in various places: Visual Studio, Azure Arm Git Template, and Azure Quickstart Templates. Great tools also exist to help you visualize or edit these templates: Azure Resource Explorer and the Azure Resource Manager Tools for VS Code.

~ Frank


A practical overview of Service Fabric

(This post is also available in French: Un aperçu pratique de Microsoft Azure Service Fabric)

fabric
During the last Global Azure Bootcamp (April 16), the Montreal edition focused on Azure Service Fabric. The schedule for that day was packed with dynamic presentations and hands-on labs. Whether you were unable to be present or just want to learn more, it is not too late. The MsDevMtl team is happy to share all the documents we used.

Before you start

To be able to do the hands-on labs you will need:
  • A PC with Windows 7, 8, 8.1 or 10
  • Visual Studio 2015 Update 1 or 2
  • Service Fabric SDK v2.0.135 (or more recent)
  • A Microsoft Azure subscription. If you don't already have one, here is a free trial to get you started: aka.ms/azuretrialmtl

The One-day Schedule

As you will see, the complete edition of the Azure Service Fabric Learning Path contains much more detail and many more labs. Since we had only one day, we decided to cherry-pick what we thought was the best way to get started, but feel free to do more.

OneDayLearningPath

  • Intro to services and Service Fabric Overview
  • Labs 1 & 2
  • Application Packaging & deployment
  • Lunch
  • Labs 3 & 4
  • MS at scale & High Availability
  • Lab 5
  • Diagnostics & Health policies
  • Lab 6
  • Who is using SF, AoA, Testing
  • Lab 7 & 8
  • Upgrades
  • Lab 9
  • Conclusion

Where to find everything you need


Questions?

All the presentations were done by Alexandre Brisebois (@Brisebois), Stephane Lapointe (@s_lapointe), and myself (@fboucheros), under the supervision of our fantastic bandmaster Guy Barrette (@GuyBarrette). Feel free to ask us questions if you get blocked during your journey.





MVP Award 2016!

MVP2016_paque

Thank you

For two years now, the first of April has been very special for me. Not because of all the April Fools' jokes, but because that's the date you made me a Microsoft Azure MVP.

I'm very proud and happy to be renewed again this year. I wanted to take the time to thank you all. I will continue to write and talk about my passions!
See you soon...






Upcoming events



6 ways to go from Markdown to Azure Web App

Everything started when I wanted to share a blog post in progress with someone for review. I didn't want to create a copy, and I was looking for an extremely simple way to share it, like a URL. This blog post is about my journey to find that method and all the great possibilities available. I was really happy to end up with Azure Web App.
I write in Markdown; it's a syntax I really like because it's simple, and no special application is required to use it. To learn more about it, see my previous posts: Why I switch to Markdown, First VSCode Tasks in less than 5 minutes, and Meet my new best friend: Visual Studio Code. The more I use it, the more I like it. I started using it not only for blogging, but also for all kinds of notes.

DropBox

One very good thing about Markdown is the fact that it's compatible with all platforms. Because of that, I keep my texts in Dropbox. Why not Google Drive or OneDrive? Because Dropbox automatically generates an HTML version, so my reviewer can read it in a beautiful format. The file is easily shareable, and if authenticated, my reviewer can write comments.
dropboxpostsharing
In the Pro version of Dropbox, you can give access only to specific users. That would be very nice for sharing files inside a business, or for more sensitive information.
Unfortunately for me, I don't want to force my reviewer to register. Another limitation is that relative paths for images aren't supported, so all images and charts also need to be shared individually before being added to the text.

Repositories: GitHub, Bitbucket, etc.

By default, most repositories convert Markdown files to HTML, making them very easy to read. It's also a very good way to keep a saved copy. But then you need to have a public repository or give people access...
Using only a repository was not good enough in my case, because I didn't wish to share unfinished work with everyone.

Jekyll


Option 1 - Jekyll

Jekyll is a static website generator written in Ruby. It's really well integrated with GitHub, and you can even host your blog in a GitHub repository. However, since I would prefer to keep my in-progress work more private, I decided to go with Bitbucket, a great repository service that supports the Git and Mercurial systems and allows private repositories.
We could have Jekyll in a Git repository hosted on Bitbucket, hooked up to an Azure Web App with continuous deployment.
Here are the steps:
  1. First, create a private repository on Bitbucket.
  2. Clone that fresh repository on your local machine.
  3. Now it's time to create your Jekyll site.
    • If you don't have Ruby or Jekyll already installed on your machine, now is the time; it's very easy, just follow the instructions on the official website.
    • To create a new site, open a command prompt and type the command: jekyll new NameOfMySite then cd ./NameOfMySite and jekyll serve
    To see your new site, just browse to http://localhost:4000. Add your Markdown files to the folder _posts and be sure they respect the naming convention YYYY-MM-DD-Title.md
  4. Now it's time to add all the files to our Git repository with the command git add -A, and before pushing, let's create a new Azure Web App.
  5. Go to http://portal.azure.com and create a new Azure Web App.
    CreateAzureWebApp
    • From the top left, click the "+ New" button.
    • Select Web + Mobile, then click on Web App.
    • Fill in the name and subscription plan, and click the Create button.
  6. After a few seconds, the Web App will be ready. It's time to add continuous deployment to it.
    AddContinuous
    Note that right now the deployment settings are FTP.
    • In the Web App blade, if you're not already there, go to the Settings section.
    • Scroll down the Settings to Continuous deployment and click on it.
    • Now choose your source control, in this case Bitbucket.
  7. It's now time to publish our site to our remote repository with git push.
  8. In the Azure portal, you will see the deployment progress and history.
    DeploymentHistory
The combination of Jekyll, Bitbucket, and Azure Web App works great, but we need to generate the code locally and check in both the source and the generated content. Furthermore, since we need to generate the code, Ruby and Jekyll need to be installed on every machine we will be using.

Option 2 - Jekyll Extension to Azure Web App

I found a really great Azure Web App extension on GitHub: the Jekyll Extension. It simplifies the process a lot, thanks to Cory's work. To use it, simply follow the four steps explained on the GitHub page:

  1. Create an Azure Web App.
  2. Set an App Setting for SCM_COMMAND_IDLE_TIMEOUT to 600. From the Web App blade, click on Settings and select Application settings. Add the new line, and click the Save button.
     SCM_AppSetting
  3. Install the Jekyll Site Extension.
    • Again from the Web App blade, click on Tools, then select Extensions.
    • Click the Add button.
    • Find and select the Jekyll Extension.
    AddJekyllExtension
  4. Now hook up your Git repository, or push a local (in Azure) Git repository with your Jekyll site.
I really liked this solution. It's very simple to install. Because it uses a repository, I can keep a history of all my texts. Moreover, only the texts and images are in the repository, and since the site is generated in the cloud, there is no need to install anything on other machines.
WriteFromVSCode
Directly from Visual Studio Code, I can write my article, and when I'm ready I just need to do a push (still inside VSCode). The site will automatically be built and deployed to my Azure Web App.

Sandra.Snow

Nancylogo
While doing my research, I found Sandra.Snow, another static site generator inspired by Jekyll, but written in .NET using the Nancy library.
To use it, a little bit of work is required. The easiest way is to fork the GitHub project and compile the solution to get the DLLs and EXEs.
  • Create a new folder for your site [MySnowSite].
  • In the MySnowSite folder, create another folder, Sandra.Snow.Processor, and copy in the files generated previously: Nancy.dll, Nancy.Testing.dll, Nancy.ViewEngines.Razor.dll, and Snow.exe.
  • You can now copy the Sandra.Snow/SnowSite/Snow folder into the MySnowSite folder.
  • Add the .deployment and deploy.cmd files from Sandra.Snow/SnowSite into the MySnowSite folder.
SandraSnowFolder
A few changes were required in deploy.cmd (lines 29, 31, 56, and 57):
@echo off

:: ----------------------
:: KUDU Deployment Script
:: ----------------------

:: Setup
:: -----

setlocal enabledelayedexpansion

SET ARTIFACTS=%~dp0%artifacts

IF NOT DEFINED DEPLOYMENT_SOURCE (
  SET DEPLOYMENT_SOURCE=%~dp0%.
)

IF NOT DEFINED DEPLOYMENT_TARGET (
  SET DEPLOYMENT_TARGET=%ARTIFACTS%\wwwroot
)

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Deployment
:: ----------

:: 3. Build Snow Site
echo -----
echo Start - Building the Snow Site
echo Running Snow.exe config=%DEPLOYMENT_SOURCE%\Snow\
pushd %DEPLOYMENT_SOURCE%
call  %DEPLOYMENT_SOURCE%\Sandra.Snow.Processor\Snow.exe config=%DEPLOYMENT_SOURCE%\Snow\
IF !ERRORLEVEL! NEQ 0 goto error
echo Finish - Building the Snow Site
echo -----


IF NOT DEFINED NEXT_MANIFEST_PATH (
  SET NEXT_MANIFEST_PATH=%ARTIFACTS%\manifest

  IF NOT DEFINED PREVIOUS_MANIFEST_PATH (
SET PREVIOUS_MANIFEST_PATH=%ARTIFACTS%\manifest
  )
)

IF NOT DEFINED KUDU_SYNC_COMMAND (
  :: Install kudu sync
  echo Installing Kudu Sync
  call npm install kudusync -g --silent
  IF !ERRORLEVEL! NEQ 0 goto error

  :: Locally just running "kuduSync" would also work
  SET KUDU_SYNC_COMMAND=node "%appdata%\npm\node_modules\kuduSync\bin\kuduSync"
)


echo Kudu Sync from "%DEPLOYMENT_SOURCE%\Snow\Website" to "%DEPLOYMENT_TARGET%"
call %KUDU_SYNC_COMMAND% -q -f "%DEPLOYMENT_SOURCE%\Snow\Website" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.deployment;deploy.cmd" 2>nul
IF !ERRORLEVEL! NEQ 0 goto error

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

goto end

:error
echo An error has occurred during web site deployment.
exit /b 1

:end
echo Finished successfully.
As previously, create an Azure Web App and hook up a Git repository, or push to an Azure one. You can find a lot of information on the blog of Sandra.Snow's creator, Phillip Haydon.

Bonus

For both Jekyll (option 2) and Sandra.Snow, which use Azure Web App continuous deployment, you can use Dropbox instead of a Git repository. Why would you use Dropbox? Well, since Dropbox is available on any kind of platform, you would be able to write from your iPad or Android tablet, or anything! To learn more about how to do it, see one of my previous posts: Setup an automatic deployment on Azure with Dropbox in 5 minutes.
Just for fun, I created a theme for Sandra.Snow that I put on GitHub: Sandra.Snow.NotesTheme. Feel free to use it.

Enjoy!

~Frank Boucher

Simple as Azure Marketplace

It happens to all of us: we need to have something done, and we needed it for yesterday. In those times, we struggle; we don't know where to start or, worse, what to do. My grandmother used to say: “When I don't know what to cook, I always start by sautéing onions. It smells so good, it helps me get started." In this post, I will show you my “sauté of onions” tip to get an environment up and running quickly.

Everybody knows that when you are looking for something for your mobile device, you just need to go to the “App Store”. But did you know Microsoft Azure has its own store? It's called the Azure Marketplace.

Your online store for thousands of certified, open source, and
community software applications, developer services, and
data—pre-configured for Microsoft Azure. Download, deploy, and get
more done.
It doesn't contain two or three hundred items, but more than three thousand five hundred. Yes, that's right, more than 3,500! Whether you need a virtual machine, a web application, or a web API, there's a great chance it will be available in the Marketplace. And it still continues to grow day after day.

It's easy to think that we can “pop” a WordPress website in less than 5 minutes. It's also true that creating a brand-new virtual machine, with a vanilla Windows Server or Linux, takes only a few clicks. Moreover, many much more complex solutions can be created in the same way.

1, 2, 3, CommVault Simpana

Recently, I needed to find a fast and easy way to create a solution providing data management that is easily accessible regardless of location. A quick search in the Azure Marketplace showed me all the different options I had. I know that Simpana is a great product, so let's use that one.

AzureMarketSearch

When an item is selected from the search list in the Azure Marketplace, the detail view is presented. This page contains all the information: prices, sizes, documentation, and references.
Let's start! Press the big green “Create Virtual Machine” button at the top of the screen. That will open the Microsoft Azure Portal with the blade ready to create your Simpana. An active Azure subscription is required; if you don't have one, get started with a one-month free trial here. At the bottom of the page, you will need to select the deployment mode. I strongly suggest selecting Resource Manager, because it provides more flexibility to manage the resources once they are created. When you are ready, click the Create button.

Step_1-3_create

The first and second steps ask for the basic information: name, subscription, resource group, location, and size. The third step should already be populated based on the information you entered, but feel free to change it if you wish. The next two steps are there to be sure you understand the billing, and to summarize the new deployment.

commvault-_RG

When all the steps are completed, the portal will start the deployment of our solution. After a few minutes, it should be done, and if you open the resource group, you can see all the deployed and configured items.

Simpana_and_Console_Applications

By selecting the virtual machine in the resource group, you get a Connect option. Click on it to download a Remote Desktop connection. Once connected, click the Start menu, and you will find Simpana ready to serve.

The Azure Marketplace is a really important and powerful tool. It helps create simple or complex solutions easily and in only a few minutes: a good way to improve our productivity.