
How to Create Your Custom Artifacts for DevTest Labs

Most of the time, when we use an Azure DevTest Lab, it is to test our own applications. This means that we will need to install them on the virtual machines, every time. To do that, we need to create a custom artifact and add it to our formulas or to our claimable VMs. Lucky for us, creating a custom artifact is much easier than you may think. In this post, I will show you how easy it can be.

Goal

I want to create an artifact available from a private repository (Git from dev.azure.com in this case) that will set the timezone inside the VM.

Getting started

First, let's use a very useful section of the Azure portal: the Get Started section. In the portal, navigate to your DevTest Lab (1) and select the Getting Started option from the left menu bar (2). In this new bar, scroll down to the Learn more area and select Sample artifacts and scripts (3).

GetStarted
That will open the DevTestLab artifacts, scripts and samples project from Azure on GitHub. Open the Artifacts folder to see the list of all the usual artifacts you find in the public repo that is available by default in the portal.

Notice how all artifacts are in their own folder. When you create a new artifact, you can always come here and pick something similar to what you are trying to do. This way, you won't start from scratch. Let's open windows-vsts-download-and-run-script. An artifact is defined in the file Artifactfile.json. This file is mandatory and cannot be renamed. You can put scripts, images, or anything else you need inside this folder.

Open the Artifactfile.json file and have a look.

ArtifactfileSample
As you can see, it's a simple JSON file. In section (A) you define the title, description, publisher, OS, and the icon. Note that the icon must be publicly accessible; it could be on GitHub, in blob storage, or on a website. Section (B) defines all the parameters you may need to install your artifact on the VM. Finally, section (C) is the command to execute.

Create the Artifact

Here is the JSON for our windows-Set-TimeZone artifact. I made it very static by not passing any parameter, but in a real situation, a time zone parameter would be better (a parameterized sketch follows the file).
Artifactfile.json
{
    "$schema": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/schemas/2016-11-28/dtlArtifacts.json",
    "title": "Set TimeZone to Eastern Standard Time",
    "description": "Execute tzutil command on the VM set set the Time Zone",
    "publisher": "FBoucher",
    "tags": [
        "PowerShell"
    ],
    "iconUri": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/Artifacts/windows-run-powershell/powershell.png",
    "targetOsType": "Windows",
    "parameters": { },
    "runCommand": {
        "commandToExecute": "tzutil.exe /s \"Eastern Standard Time\""
    }
}
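
Here is a minimal sketch of that parameterized version. It's my own variation (the timeZoneId parameter name is my choice), so validate the exact syntax against the samples in the public artifact repository:

{
    "$schema": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/schemas/2016-11-28/dtlArtifacts.json",
    "title": "Set TimeZone",
    "description": "Execute the tzutil command on the VM to set the Time Zone",
    "publisher": "FBoucher",
    "tags": [ "PowerShell" ],
    "iconUri": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/Artifacts/windows-run-powershell/powershell.png",
    "targetOsType": "Windows",
    "parameters": {
        "timeZoneId": {
            "type": "string",
            "displayName": "Time Zone Id",
            "description": "A Windows time zone id, as listed by 'tzutil /l'.",
            "defaultValue": "Eastern Standard Time"
        }
    },
    "runCommand": {
        "commandToExecute": "[concat('tzutil.exe /s \"', parameters('timeZoneId'), '\"')]"
    }
}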

Create an artifact repository

For this post, I'm using Git from Azure DevOps (dev.azure.com), previously named VSTS, but any private repository should work. If it's not already done, create a project and go to the Repos section. Create a root folder named Artifacts, or something else if you prefer. Then add a new folder for your artifact. To follow best practices, the name of your artifact should start with the name of the targeted OS; in my case windows-Set-TimeZone. Now add the Artifactfile.json file defined previously.

Note the URL of the repository; you can easily get it by clicking the Clone button at the top right of the screen.

clone

Add repository to DevTest Lab

Now we need to add this repository to our DevTest Lab. From portal.azure.com, open the blade of your lab. From the left panel, click on Repository, then click the Add button.

CreateRepo
It's time to use the information noted previously. This is about the Repository, not the artifact.

Use the Artifact

The only thing left is to use our artifact. You can find it while creating a VM or a formula. When you have parameters defined in your Artifactfile.json, they will be listed in a form at the left.
artifactList
And if you try it, you will see that the time matches the desired time zone. Here my PC is set to display in a 24-hour format, but it's the same... yep, I'm in Eastern Standard Time.

voila

Add it to an ARM template

Doing it with the nice interface is good when you are learning. However, we all know that no DevOps engineer will do that manually every time, so let's add our repository to our ARM template. If you need more detail on the deployment method, I explain it in a previous post: How to be efficient with our Azure DevTest Lab deployments.

When you don't know the type or the structure of a resource, you can always go to the Resource Explorer (resources.azure.com); there you will be able to find your resource and see how it's defined.

ResourceExplorer
So for this post our artifactsources will look like this:
{
    "properties": {
        "displayName": "Cloud5mins",
        "uri": "https://fboucher.visualstudio.com/DefaultCollection/Cloud5minsArtifacts/_git/Cloud5minsArtifacts",
        "sourceType": "VsoGit",
        "folderPath": "/Artifacts",
        "armTemplateFolderPath": "",
        "branchRef": "master",
        "securityToken": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
        "status": "Enabled"
    }, 
    "name": "Cloud5minsRepo",
    "type": "Microsoft.DevTestLab/labs/artifactsources"
}
An artifactsources resource goes in the resources list inside the DevTest Lab resource.

ARM
In an ARM template, you have the main resources node (A), then the lab node (B). Inside this node, you should see a second resources list (C), where the virtual network is defined. The artifactsources should go there.
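
To make that nesting concrete, here is a trimmed-down sketch of the structure; the apiVersion, the virtual network details, and the personalAccessToken parameter are placeholders of mine (check Resource Explorer for the real values of your lab):

"resources": [
    {
        "apiVersion": "2018-09-15",
        "type": "Microsoft.DevTestLab/labs",
        "name": "[parameters('labName')]",
        "location": "[resourceGroup().location]",
        "resources": [
            {
                "apiVersion": "2018-09-15",
                "type": "virtualnetworks",
                "name": "[variables('labVirtualNetworkName')]",
                "dependsOn": [
                    "[resourceId('Microsoft.DevTestLab/labs', parameters('labName'))]"
                ]
            },
            {
                "apiVersion": "2018-09-15",
                "type": "artifactsources",
                "name": "Cloud5minsRepo",
                "dependsOn": [
                    "[resourceId('Microsoft.DevTestLab/labs', parameters('labName'))]"
                ],
                "properties": {
                    "displayName": "Cloud5mins",
                    "uri": "https://fboucher.visualstudio.com/DefaultCollection/Cloud5minsArtifacts/_git/Cloud5minsArtifacts",
                    "sourceType": "VsoGit",
                    "folderPath": "/Artifacts",
                    "branchRef": "master",
                    "securityToken": "[parameters('personalAccessToken')]",
                    "status": "Enabled"
                }
            }
        ]
    }
]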

Then when you declare your formula, you just need to reference this repository, exactly like the public one.
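
For reference, a formula (or VM) picks up the artifact through its id. Here is a hedged sketch, assuming the repository name Cloud5minsRepo used above; double-check the exact resourceId segments against a formula exported from your own lab:

"artifacts": [
    {
        "artifactId": "[resourceId('Microsoft.DevTestLab/labs/artifactsources/artifacts', parameters('labName'), 'Cloud5minsRepo', 'windows-Set-TimeZone')]",
        "parameters": []
    }
]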

In a video, please!

I also have a video of this post if you prefer.




Reading Notes #344

CI-CD

Suggestion of the week


Cloud


Programming


Books

Five_cover
The Five Dysfunctions of a Team: A Leadership Fable (Patrick Lencioni) - I really enjoyed this book. The fact that the material is first presented as a story adds a lot of perspective and helps our comprehension. In the last chapter, the author returns to the theories and gives more details. I completely devoured this book; I'm looking forward to reading more.


Miscellaneous


~Enjoy


Reading Notes #343

Cloud


Programming


Reading Notes #337

IMG_20180707_220101

Cloud



Programming



Data



Miscellaneous




Are the Azure DevOps Projects worth it?

Imagine you just arrived at the office. You have only taken a sip or two of your coffee or tea. You look at the tasks that need to be done today (well, yesterday, based on the request): a new project is starting, and you need to configure everything the team needs to start building that web application. They need a repository, a continuous integration and continuous delivery (CI/CD) pipeline, a place to deploy, monitoring tools, and of course an environment where they will be able to track their work. Should you panic? No, because you will use the new Azure DevOps Project available in Azure.

Let's Create the project


From the Azure portal (portal.azure.com) click on the plus button and search for "devops". Select DevOps Project, then click on the Create button. Then follow the five steps and Azure will create everything for you.

What is deployed


  • Your application from many popular frameworks
  • Automatic full CI/CD pipeline integration
  • Monitoring with Application Insights
  • Git Repository
  • Tasks/ Bugs tracking board
  • Deployment to the platform of your choice


In Video please!




Conclusion


The DevOps Projects are really fantastic and very useful. The fact that everything is packaged together and automatically deployed is a considerable time saver. In short, are the Azure DevOps Projects worth it? Oh yeah!

Reading Notes #324

Cloud

IMG_20180421_092215


Programming



Miscellaneous



Books


Eat That Frog!: 21 Great Ways to Stop Procrastinating and Get More Done in Less Time

Author: Brian Tracy

A short book that pushes to action. I really enjoyed it. A book to read and read again.

ASIN: B01MYEM8SZ








Reading Notes #323

Suggestion of the week


Cloud


Programming


Miscellaneous


Databases


Books


This book doesn't age! 

This book may be old, but it's still incredibly true. I loved the way the reader was speaking and the rich vocabulary. It's definitely a must.

ASIN: B003WEAI4E







Reading Notes #309

Cloud


Programming


Databases


Miscellaneous



Reading Notes #276

Suggestion of the week


Cloud


Programming


Miscellaneous

Reading Notes #270

Suggestion of the week


Cloud


Databases


Miscellaneous



Reading Notes #269

Suggestion of the week


Cloud


Programming


Miscellaneous



How I use Azure Logic App, API App and Function App in my life

Why should we do things over and over? I know I don't want to waste my time doing so. Every time I see myself doing something I already did, I look for optimization. Sometimes, it's just little things like trying a different path to get from the train station to the office, and other times, it's like drinking the rainbow juice directly from the source.

In this post, I will share with you how I optimized a three-hour process into a two... minute automated, customizable solution using Azure Logic App and API App.

Quick Context

For the past two hundred fifty-something weeks, every Monday I share my reading notes. Those are articles that I read during the previous week with a little comment, my 2 cents about it. I read all those books, articles and posts on my e-book reader, in this case a Kindle Paperwhite. I use an online service called Readability that can easily clean the article (remove everything except the text and useful images) and send it to my e-book reader. All the articles are kept as a bookmark list.

SendToKindle

When I have time, I read what's on my list. Each time I'm done, I add a note ending with tags between square brackets; those will be used later for searching and filtering.

OnlyNotewithTags

Every Sunday, I extract the notes, then search in Readability to find the URLs, and with all that I build my post for the next morning. A few years ago, I did a first optimization using a Ruby script, and I blogged about it here. While it's been working very well all those years, I wanted to change it so I do not have to install Ruby and the gems on every machine I use.

Optimization Goals

Here is the list of things I wanted for this new brew of the Reading Notes post generator:
  • No install required
  • Not coupled to a device or a service
  • Generates a json version for future purposes
  • Cheap

Optimization Plan

Azure Logic App is the perfect candidate for this optimization because it doesn't require any local installation. Since it is a flow between connectors, any of them can be changed to please the user. In fact, that was the primary factor that made me pick Azure Logic App. For example, in the current solution, I'm using OneDrive as an initial drop zone. If in your environment you would prefer using Dropbox instead of OneDrive, you just need to switch connectors, and nothing else will be affected. Another great advantage is that Azure Logic App is part of the App Service ecosystem, so all those components are compatible.

Here is the full process plan, at the high level.

ReadingNoteLogicApp_blog

  1. Drop the My Clippings.txt file in a OneDrive folder.
  2. Make a copy of the file in Azure Blob Storage, using the Blob Storage built-in connector.
  3. Parse the My Clippings.txt file to extract all the clippings (notes) added since the last extraction, using my custom My Clippings API App.
  4. For each note,
    1. Get the URL the post is coming from, using my custom Readability API App.
    2. Extract the tags and category (the first tag is used as the category), using my custom Azure Function.
    3. Serialize the note with all the information in json.
    4. Save the json file to a temporary container in Azure Blob Storage.
  5. Generate a summary of all the json files using my last custom API App, ReadingNotes. It also saves json and Markdown versions of the summary.
Note: The summary could be published directly, but I decided to keep that last step manual, so I can check for typos and grammar.
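
To give an idea of what step 4.3 produces, here is a sketch of a serialized note; the property names and values are my illustration of the information gathered in the previous steps, not the exact schema:

{
    "title": "Some great article about automation",
    "url": "https://example.com/some-great-article",
    "note": "Interesting point of view on automation.",
    "category": "cloud",
    "tags": [ "cloud", "automation" ],
    "dateAdded": "2016-06-19T21:34:00"
}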

Let's look at the details

Many great tutorials are already available about how to create an Azure Logic App or an Azure API App, so in this post I won't explain how to create them, but will mostly share with you some interesting snippets or gotchas I encountered during my journey.

In the next screen capture, you can see all the steps contained in my two Logic Apps. I decided to split the process because many tasks needed to be done for every note. The main loop (on the left) fetches the notes collection and generates the output. For every note, it will call the child Logic App (on the right).


logicapp-Overview

The ReadingNotes Builder contains:
  1. Initial trigger when a file is created in a OneDrive folder.
  2. Create a copy of the file in an Azure Blob storage.
  3. Delete the file in OneDrive.
  4. Get the configuration file from Azure blob storage.
  5. Call the API App responsible for parsing the file and extracting the recent clippings.
  6. Call the child Logic App for every clipping returned in the previous step (foreach loop).
    • A. Trigger of the child Logic App.
    • B. Call the API App responsible for searching the online bookmark collection and returning the URL of the article/post.
    • C. Call the Azure Function responsible for extracting tags from the note.
    • D. Call the Azure Function responsible for converting the object to json.
    • E. Save the json object in a file in Azure blob storage.
    • F. Get the updated configuration file.
    • G. Call the Azure Function responsible for keeping the latest note date.
    • H. Update the configuration with the latest date.
    • I. Return an HTTP 200 code, so the parent Logic App knows the work is done.
  7. Call the API App responsible for building the final markdown file.
  8. Save the markdown file to the Azure blob storage location returned at the previous step.
  9. Update the configuration file one last time.

As an initial trigger, I decided to use OneDrive because it's available from any device, from anywhere. I just need to drop the MyClippings.txt file in the folder, and the Logic App will start. Here is how I configured it in the editor:


logicappMain-InitialTrigger

MyClippings API App


Kindle notes and highlights are saved in a local file named My Clippings.txt. To be able to manipulate my notes, I needed a parser. Ed Ryan created the excellent [KindleClippings][KindleClippings], an open-source project available on GitHub. In the current project, I only wrapped that .NET parser in an Azure API App. Wrapping it in an API App will help me later to manage security and metrics, and it will simplify the connections in all the other Azure apps of the solution.

I won't go into the details of how to create an Azure API App; a lot of great documentation is available online. However, you will need at least the Azure .NET SDK 2.8.1 to create an API App project.

CreateAPIApp_step1


Swagger API metadata and UI are a must, so don't forget to uncomment that section in the SwaggerConfig file.


CreateAPIApp_step2


I needed an array of the notes taken after a specific date. All the heavy work is done in the [KindleClippings][KindleClippings] library. The ArrayKindleClippingsAfter method gets the content of the My Clippings.txt file (previously saved in Azure blob storage) and passes it to the KindleClippings.Parse method. Then, using Linq, it only returns the notes taken after the last ReadingNotes publication date. All the code for this My Clippings API App is available on GitHub, but here is the method:


public Clipping[] ArrayKindleClippingsAfter(string containername, string filename, string StartDate)
{
    var blobStream = StorageHelper.GetStreamFromStorage(containername, filename);
    var clippings = KindleClippings.MyClippingsParser.Parse(blobStream);
    var result = (from c in clippings
                    where c.DateAdded >= DateTime.Parse(StartDate)
                    && c.ClippingType == ClippingTypeEnum.Note
                    select c as KindleClippings.Clipping).ToArray<Clipping>();
    return result;
}


Once the Azure API App is deployed in Azure, it can easily be called from the Azure Logic App.


AddCustomAPIApp


A great thing to notice here is that because Azure Logic App is aware of the required parameters and remembers all the information from the previous steps (shown in different colors), it is very simple to configure the call.


logicappMain-GetTheClippings

Azure Logic App calling another Azure Logic App


Now that we have an array of notes, we can loop through each of them to execute the other steps. When I wrote the app, the only way to execute multiple steps in a loop was to call another Azure Logic App. The child Logic App is triggered by an HTTP POST and returns an HTTP 200 code when it's done. A json schema is required to define the input parameters; a great tool to get that done easily is http://jsonschema.net.
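
For illustration, a minimal schema for the child Logic App input could look like the following; the property names are assumptions for this example, not the exact ones used in my Logic App:

{
    "type": "object",
    "properties": {
        "note": { "type": "string" },
        "title": { "type": "string" },
        "dateAdded": { "type": "string" }
    },
    "required": [ "note", "title" ]
}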


logicappSub-InitialTrigger

Readability API App

Readability is an online bookmark service that I have been using for many years now. It offers many APIs to parse or search articles and bookmarks. I found a great .NET wrapper on GitHub called CSharp.Readability, written by Scott Smith. I could have called Readability directly from the Logic App. However, I needed a little more, so I decided to use Scott's version as the base for this API App.
From the clipping collection returned at the previous step, I only have the title and need to retrieve the URL. To do this, I added a recursive method called SearchArticle.

private BookmarkDetails SearchArticle(string Title, DateTime PublishDate, int Pass)
{
    var retryFactor = 2 * Pass;
    var fromDate = PublishDate.AddDays(-1 * retryFactor);
    var toDate = PublishDate.AddDays(retryFactor);
    var bookmarks = RealAPI.BookmarkOperations.GetAllBookmarksAsync(1, 50, "-date_added", "", fromDate, toDate).Result;
    var result = from b in bookmarks.Bookmarks
                    where b.Article.Title == Title
                    select b as BookmarkDetails;
    if (result.Count() > 0)
    {
        return result.First<BookmarkDetails>();
    }
    if (Pass <= 3)
    {
        return SearchArticle(Title, PublishDate, Pass + 1);
    }
    return null;
}

Azure Function: Extract tags

While most of the work was done in the different APIs, I needed a few other little tools. There were many possibilities, but I decided to take advantage of the new Azure Functions. They just sit there waiting to be used! The ReadingNotes Builder uses three Azure Functions; let me share one of them: ExtractTags. An interesting thing with functions is that you can configure them to be triggered by some event or to act as a webhook.

CreateFunctionApp

To create a function as a webhook, you can use one of the templates when you create the new function, or, from the code editor in the Azure portal, you can click on the Integrate tab and configure it.


SetWebHookonFunctionApp


Once it's done, you are ready to write the code. Once again, this code is very simple. It starts by validating the input: I'm expecting a json with a node called note. It then extracts the tags from it and returns both parts.


public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    string tags;
    string cleanNote;
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);
    if (data.note == null ) {
        return req.CreateResponse(HttpStatusCode.BadRequest, new {
            error = "Please pass note properties in the input object"
        });
    }
    string note = $"{data.note}";
    int posTag = note.LastIndexOf('[')+1;
    if(posTag > 0){
        tags = note.Substring(posTag, note.Length - posTag-1);
        cleanNote = note.Substring(0,posTag-1);
    }
    else{
        tags = "";
        cleanNote = note;
    }
    return req.CreateResponse(HttpStatusCode.OK, new {
        tags = $"{tags}",
        cleanNote = $"{cleanNote}"
    });
}
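
To make the contract concrete, here is an example of a call, assuming a note that ends with two tags (the values are made up for illustration). The input:

{
    "note": "Interesting overview of Logic Apps. [azure,cloud]"
}

And the response:

{
    "tags": "azure,cloud",
    "cleanNote": "Interesting overview of Logic Apps. "
}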


Now, to call it from the Azure Logic App, you will first need to Show Azure Functions in the same region to see your Function App. Then simply add the input; in this case, as expected, a json with a node called note.


logicappSub-FunctionCall

What's next

Voila! This is a simple but real application. While I shared only a part of the code in this post, all of it is available on GitHub. I also did a presentation at DevTeach, and more details are available in those slides. Using Azure Logic App to build this application was really interesting and easy. Now that some pieces are in place (up there), I will be able to grow my environment by adding more functionality, an interface, more security... but that's for another post.




Reading Notes #244

Cloud


Programming


Miscellaneous



Create and Deploy .NET Core WebApp to Azure from Linux

(This post is also available in French.)

The other day, I was glued to my PC, and I had some spare time (yeah, I know, very unusual). Since .NET Core 1.0 had been released just a few days before, I decided to give it a try. To add an extra layer of fun to the mix, I decided to do it from my Ubuntu VM. In less than 15 minutes, everything was done! I was so impressed I knew I needed to talk about it. That's exactly what this post is about.

The preparation

Before we get started, it's important to know which version of Ubuntu you are using, because some commands will be slightly different. To find the version you are running, you simply need to click the gear at the top right of the desktop screen and select About this Computer. In my case, since I'm using Ubuntu 14.04 LTS, I will be using the commands related to this version. If you are using a different version, please refer to the .NET Core documentation.

ubuntu_version

Now we need to install .NET Core. Nothing could be easier. Open a Terminal (Ctrl+Alt+T) and type these commands:

# Setup the apt-get feed adding dotnet as repo
sudo sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/dotnet/ trusty main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893

# Get the latest packages
sudo apt-get update

# Install .NET Core SDK
sudo apt-get install dotnet-dev-1.0.0-preview2-003121
Once it's all done, you can type dotnet --info and you should see something like this.

dotnet_info


Create the Local WebApp

From the same Terminal we will create an empty folder for our solution and move into it. Execute these commands.
mkdir demodotnetweb
cd demodotnetweb
We now want to create our new web project. This is done with the command dotnet new, but we need to specify the type; otherwise, it will create a console application.
dotnet new -t web
Now to download all the references (nuget packages) of our project required, execute the following command:
dotnet restore
Based on the speed of your Internet connection and how many dependencies are required, this can take from a few seconds to more than one minute.
To test if our solution works locally type the command:
dotnet run
That will compile our solution and start the ASP.NET Core internal hosting. Launch a web browser and go to http://localhost:5000 to see the app in action.

dotnetcore_localhost

Deploy to Azure

To deploy our new project to the cloud, we will use the continuous deployment capability of Azure. First, navigate to portal.azure.com and create a Web App.

create_webApp

Once the application is created, you should see the "This web app has been created" message when you navigate to [nameofWebApp].azurewebsites.net.

successfully_created

It's now time to add a local Git repository. In the Web App Settings, select Deployment source. Click on Configure Required Settings, then select the Local Git Repository option.

add_source_control

After setting the user credentials for the repository, we can get the URL from the Essentials section.

repourl

Back to our Ubuntu Terminal, we will have to execute these commands:

# create a git repository
git init
# stage and commit all files
git add -A
git commit -m "Init"

# Add the remote repository
git remote add azure https://username@demowebcore.scm.azurewebsites.net:443/demowebcore.git

# Push the code to the remote
git push azure master
After a minute or so you should have your WebApp online!

dotnetcore_azure


Voila! That was fun!

Reading Notes #239

Cloud


Programming


Miscellaneous



6 ways to go from Markdown to Azure Web App

Everything started when I wanted to share an in-progress blog post with someone for review. I didn't want to create a copy, and I was looking for an extremely simple way to share it, like a URL. This blog post is about my journey to find that method and all the great possibilities available; in the end, I was really happy with Azure Web Apps.
I write in Markdown; it's a syntax I really like because it's simple and no special application is required to use it. To learn more about it, see my previous posts: Why I switch to Markdown, First VSCode Tasks in less than 5 minutes, and Meet my new best friend: Visual Studio Code. The more I use it, the more I like it. I started using it not only for blogging, but also for all kinds of notes.

DropBox

One very good thing about Markdown is the fact that it is compatible with all platforms. Because of that, I keep my texts in Dropbox. Why not Google Drive or OneDrive? Because Dropbox automatically generates the HTML version, so my reviewer can read it in a beautiful format. The file is shareable very easily, and if authenticated, my reviewer can write comments.
dropboxpostsharing
In the Pro version of Dropbox, you can give access only to specific users. That would be very nice for sharing files inside a business or for more sensitive information.
Unfortunately for me, I don't want to force my reviewer to register. Another interesting fact is that relative paths for images aren't supported, so all images/charts also need to be shared individually before being added to the text.

Repositories: GitHub, Bitbucket, etc.

By default, most repositories convert Markdown files to HTML, so they are very easy to read. It's also a very good way to have a saved copy. But then you need to have a public repository or give access to people...
Only using a repository was not good enough in my case, because I don't wish to share unfinished work with everyone.

Jekyll


Option 1 - Jekyll

Jekyll is a static website generator written in Ruby. It's really well integrated with GitHub, and you can even host your blog in a GitHub repository. However, since I prefer to keep my in-progress work more private, I decided to go with Bitbucket. Bitbucket is a great repository service that supports the Git and Mercurial systems and allows private repositories.
We can have Jekyll in a Git repository hosted on Bitbucket that is hooked up to an Azure Web App with continuous deployment.
Here are the steps:
  1. First create a private repository from Bitbucket.
  2. Clone that fresh repository on your local machine.
  3. Now it's time to create your Jekyll site.
    • If you don't have Ruby or Jekyll already installed on your machine, now is the time. It's very easy; just follow the instructions on the official website.
    • To create a new site, open a command prompt and type the command: jekyll new NameOfMySite then cd ./NameOfMySite and jekyll serve
    To see your new site, you just need to browse to http://localhost:4000. Add your Markdown files to the _posts folder and be sure they respect the naming convention YYYY-MM-DD-Title.md.
  4. Now it's time to add all the files to our Git repository with the command git add -A, and before pushing, let's create a new Azure Web App.
  5. Go to http://portal.azure.com and create a new Azure Web App.
    CreateAzureWebApp
    • From the top left click the "+ New" button.
    • Select Web+ Mobile, then click on Web App
    • Fill in the name and subscription plan, and click the Create button.
  6. After a few seconds, the Web App will be ready. It's time to add continuous deployment to it.
    AddContinuous
    Note: right now the deployment settings are FTP.
    • In the Web App blade, if you are not already there, go to the Settings section.
    • Scroll down the Settings to Continuous deployment and click on it.
    • Now choose your source control, in this case Bitbucket.
  7. It's now time to publish our site to our Remote repository with git push.
  8. In the Azure portal, you will see the deployment progress and history.
    DeploymentHistory
The combination of Jekyll / Bitbucket / Azure Web App works great, but we need to generate the site locally and check in both the source and the generated content to the repository. Furthermore, since we need to generate the site, Ruby and Jekyll need to be installed on every machine we will be using.

Option 2 - Jekyll Extension to Azure Web App

I found a really great Azure Web App extension, the Jekyll Extension, on GitHub. It simplifies the process a lot, thanks to Cory's work. To use it, simply follow the four steps explained on the GitHub page:

  1. Create an Azure Web App 
  2. Set an App Setting for SCM_COMMAND_IDLE_TIMEOUT to 600. From the Web App blade click on Settings and select Application settings.  Add the new line, and click the save button.
     SCM_AppSetting
  3. Install the Jekyll Site Extension
    • Again from the Web App blade, click on Tools, then select Extensions
    • Click the Add button
    • Find and select the Jekyll Extension AddJekyllExtension
  4. Now we need to hook up your Git repository, or push a local (in Azure) Git repository, with your Jekyll site.
I really like this solution. It's very simple to install. Because it uses a repository, I can keep a history of all my texts. Moreover, only the texts and images are in the repository, and since the site is generated in the cloud, there is no need to install anything on other machines.
WriteFromVSCode
Directly from Visual Studio Code, I can write my article, and when I'm ready I just need to do a push (still inside VSCode). The site will automatically be built and deployed to my Azure Web App.

Sandra.Snow

While doing my research, I found Sandra.Snow, another static site generator inspired by Jekyll, but written in .NET using the Nancy library.
To use it, a little bit of work is required. The easiest way is to fork the GitHub project and compile the solution to get the DLLs and EXEs.
  • Create a new folder for your site [MySnowSite].
  • In MySnowSite folder, create another folder Sandra.Snow.Processor and copy/paste: Nancy.dll, Nancy.Testing.dll, Nancy.ViewEngines.Razor.dll and Snow.exe generated previously.
  • You can now copy the Sandra.Snow/SnowSite/Snow folder into MySnowSite folder.
  • Add the .deployment and deploy.cmd files from Sandra.Snow/SnowSite into the MySnowSite folder.
SandraSnowFolder
A few changes were required in deploy.cmd (lines 29, 31, 56, 57):
@echo off

:: ----------------------
:: KUDU Deployment Script
:: ----------------------

:: Setup
:: -----

setlocal enabledelayedexpansion

SET ARTIFACTS=%~dp0%artifacts

IF NOT DEFINED DEPLOYMENT_SOURCE (
  SET DEPLOYMENT_SOURCE=%~dp0%.
)

IF NOT DEFINED DEPLOYMENT_TARGET (
  SET DEPLOYMENT_TARGET=%ARTIFACTS%\wwwroot
)

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Deployment
:: ----------

:: 3. Build Snow Site
echo -----
echo Start - Building the Snow Site
echo Running Snow.exe config=%DEPLOYMENT_SOURCE%\Snow\
pushd %DEPLOYMENT_SOURCE%
call  %DEPLOYMENT_SOURCE%\Sandra.Snow.Processor\Snow.exe config=%DEPLOYMENT_SOURCE%\Snow\
IF !ERRORLEVEL! NEQ 0 goto error
echo Finish - Building the Snow Site
echo -----


IF NOT DEFINED NEXT_MANIFEST_PATH (
  SET NEXT_MANIFEST_PATH=%ARTIFACTS%\manifest

  IF NOT DEFINED PREVIOUS_MANIFEST_PATH (
SET PREVIOUS_MANIFEST_PATH=%ARTIFACTS%\manifest
  )
)

IF NOT DEFINED KUDU_SYNC_COMMAND (
  :: Install kudu sync
  echo Installing Kudu Sync
  call npm install kudusync -g --silent
  IF !ERRORLEVEL! NEQ 0 goto error

  :: Locally just running "kuduSync" would also work
  SET KUDU_SYNC_COMMAND=node "%appdata%\npm\node_modules\kuduSync\bin\kuduSync"
)


echo Kudu Sync from "%DEPLOYMENT_SOURCE%\Snow\Website" to "%DEPLOYMENT_TARGET%"
call %KUDU_SYNC_COMMAND% -q -f "%DEPLOYMENT_SOURCE%\Snow\Website" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.deployment;deploy.cmd" 2>nul
IF !ERRORLEVEL! NEQ 0 goto error

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

goto end

:error
echo An error has occured during web site deployment.
exit /b 1

:end
echo Finished successfully.
As previously, create an Azure Web App and hook up a Git repository, or push to an Azure one. You can find a lot of information on the blog of Sandra.Snow's creator, Phillip Haydon.

Bonus

For both Jekyll (option 2) and Sandra.Snow, which use Azure Web App continuous deployment, you can use Dropbox instead of a Git repository. Why would you use Dropbox? Well, since Dropbox is available on any kind of platform, you would be able to write from your iPad or Android tablet, or anything! To learn more about how to do it, see one of my previous posts: Setup an automatic deployment on Azure with Dropbox in 5 minutes.
Just for fun, I created a theme for Sandra.Snow that I put on GitHub: Sandra.Snow.NotesTheme; feel free to use it.

Enjoy!

~Frank Boucher