
Reading Notes #478

It's Monday, time to share my reading notes. These are a curated list of all the articles, blog posts, podcast episodes, and books that caught my interest during the week. It's a mix of the news and what I consumed.

If you think you have interesting content, share it!


Cloud

Programming

Books


Making Work Visible: Exposing Time Theft to Optimize Work & Flow

Author: Dominica Degrandis, Tonianne DeMaria

This book defines the concepts and explains how to really use tools like Kanban. It's not a story like The Phoenix Project. There is a lot in this book, and it is worth taking your time to read it.

A book to share.


~Frank





Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline

Static websites are lightning fast, and running them from an Azure Blob Storage instead of a WebApp is incredibly economical (less than $1/month). Does that mean you need to do everything manually? Absolutely not! In a previous post, I explained how to automatically generate your static website using a Build Pipeline inside Azure DevOps. In this post, let's complete the CI-CD by creating a Release Pipeline to deploy it.

The Azure Resource Manager (ARM) Template


First things first. If we want our release pipeline to deploy our website to Azure, we first need to be sure our resources are available "up there." The best way to do this is with an Azure Resource Manager (ARM) template. I will use the same project started in the previous post; feel free to adapt it to your structure or copy from it.

Create a new file named deploy.json in the deployment folder. We need a simple storage account.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "StorageName": {
            "type":"string",
            "defaultValue": "cloudenfrancaisv2",
            "maxLength": 24
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2018-07-01",
            "name": "[parameters('StorageName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "[parameters('StorageName')]"
            },
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2"
        }
    ],
    "outputs": {}
}

I used a parameter (StorageName) to define the name of the storage account. This way, I could have multiple pipelines deploying into different storage accounts.

Now, to make the ARM template accessible to the release pipeline, we also need to publish it. The easiest way to do that is to add another CopyFiles task in our azure-pipeline.yml. Add this task just before the PublishBuildArtifacts one.

- task: CopyFiles@2
  displayName: 'Copy deployment content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/deployment'
    contents: '**\*'
    targetFolder: '$(Build.ArtifactStagingDirectory)/deployment'
    cleanTargetFolder: true

Committing and pushing these changes will trigger a build. Once it's done, the ARM template will be available, and we will be able to start working on the release pipeline.

The Release Pipeline


Navigate to the DevOps project created in the previous post. This time, create a new Release Pipeline. When asked, select an empty template; we will pick the tasks we need manually.

First, we need to define the trigger and where our artifacts are. Click on the Add an artifact box on the left of the screen. Select the build pipeline, and let's use the latest version of the artifact for our deployment.

To get continuous deployment, you need to enable it by clicking on the lightning bolt and turning on the Enabled toggle.

Now let's select our tasks. Click on the "+" sign to add new tasks. We need three of these: Azure Resource Group Deployment, Azure CLI, and Azure File Copy.



Task 1 - Azure Resource Group Deployment


The first one is an Azure Resource Group Deployment. This task will be used to deploy our ARM template and make sure the resources are available in Azure.

To configure the ARM deployment, we need to select the Azure subscription and authorize the pipeline to access it. Then you will need to specify the name of the resource group you will be deploying into, its location, and finally point to the linked ARM template.
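
If you prefer the command line, here is roughly what this task does, sketched with the Azure CLI commands available at the time; the resource group name and location are assumptions for this example.

# Make sure the resource group exists (name and location are hypothetical)
az group create --name cloudenfrancaisv2-rg --location eastus

# Deploy the ARM template and pass our StorageName parameter
az group deployment create --resource-group cloudenfrancaisv2-rg --template-file deployment/deploy.json --parameters StorageName=cloudenfrancaisv2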


Task 2 - Azure CLI


The second one is an Azure CLI task. As I am writing this post, it's not possible to enable the static website property of a storage account from an ARM template. Therefore, we will execute an Azure CLI command to change that configuration. Once you have picked the Azure subscription, select Inline script and enter this Azure CLI command:

az storage blob service-properties update --account-name wyamfrankdemo --static-website  --index-document index.html

This will enable the static website property of the storage account named wyamfrankdemo, and set the default document to index.html.

Task 3 - Azure File Copy


The last task is an Azure File Copy to copy all our files from $(System.DefaultWorkingDirectory)/drop/drop/output to the $web container (in our Azure Blob Storage). The container must be named $web; that's the name used by Azure for the static website.
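
If you ever need to push the same files manually, a rough equivalent of this copy with the Azure CLI looks like the sketch below; the account name matches the demo above, and the key is a placeholder.

# Upload the whole output folder into the $web container (quote $web so the shell does not expand it)
az storage blob upload-batch --source ./output --destination '$web' --account-name wyamfrankdemo --account-key <storage-account-key>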

Wrapping up


Once you are done configuring the Release Pipeline, it's time to save and run it. After only a minute or two (this demo is pretty small), the blog should be available in Azure. To find your endpoint (aka URL), you can go into portal.azure.com and look at the static website property of the blob storage we just created.

In a video, please!


I also have a video of this post if you prefer.





How to use Azure DevOps Pipeline and Cake to generate a static website

I have that little website, a blog that doesn't consume much bandwidth, and I was looking to optimize it. Since Azure Blob Storage is such an inexpensive resource, I thought it would be the perfect fit. I could use a static website generator to transform my markdown files into a nice-looking blog and publish that to Azure! Using an Azure DevOps pipeline, I could, at every "git push", do all of that automatically without having anything installed on my machine... meaning I could write a new blog post from anywhere and still be able to update my blog.

In this post, I will explain all the steps required to create a continuous integration and continuous deployment process to deploy a static website into Azure.

The Goal


The idea here is to have a folder on a local machine tracked by git. At every push, we want that change to trigger our CI-CD process. The Build Pipeline will generate the static website. The Release Pipeline will create our Azure resources and publish those artifacts.

The Static Website


In this post, I'm using Wyam.io as the static website generator. However, it doesn't really matter; there are a ton of excellent generators available: Jekyll, Hugo, Hexo, etc. I selected Wyam because it is written in .Net, and if I eventually want to dig deeper, it will be easier for me.

All these generators follow the same pattern: you have an input folder with all your posts and images, and an output folder that contains the generated result. You don't need to track the content of your output folder, so it is a good practice to modify the .gitignore file accordingly. As an example, here is what mine looks like.

output/

tool/
tools/

wwwroot/

config.wyam.dll
config.wyam.hash
config.wyam.packages.xml

Build Pipeline


The build pipeline will generate our website for us. Therefore, it needs to have the generator installed. A great tool to do this kind of task is Cake. What I like about that tool is that it is cross-platform, so I can use it without worrying about which OS it will run on.

The Azure pipeline is defined in an azure-pipeline.yml file. Installing Cake should definitely be one of our first steps. To learn how to do that, navigate to the Get Started page of Cake's website; it explains that we need to execute build.ps1 or build.sh (depending on your build setup). That will install Cake and execute the build.cake file. Those bootstrapper files can be found in the GitHub repository, as mentioned on the website.

On the Wyam website, in the deployment section of the documentation, you will find a sample of the build.cake file we need. It looks like this:

#tool nuget:?package=Wyam&version=2.2.0
#addin nuget:?package=Cake.Wyam&version=2.1.3

var target = Argument("target", "Build");

Task("Build")
    .Does(() =>
    {
        Wyam( new WyamSettings {
            Recipe = "Blog",
            Theme = "CleanBlog"
        });        
    });

Task("Preview")
    .Does(() =>
    {
        Wyam(new WyamSettings
        {
            Recipe = "Blog",
            Theme = "CleanBlog",
            Preview = true,
            Watch = true
        });        
    });

RunTarget(target);

The first two lines install the required NuGet packages (you should definitely specify the versions). Then it defines some tasks and runs the generation command. Create that file at the root of the website folder.
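
If you want to test the script locally before pushing, and assuming you are using the standard build.ps1 bootstrapper from the Cake resources repository, something like this should work from a PowerShell prompt:

# Run the default Build task and generate the site in the output folder
.\build.ps1

# Or run the Preview task defined above to serve the site locally while you write
.\build.ps1 -Target Preview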

Now let's have a look at the azure-pipeline.yml file.

trigger:
- master

variables:
  DOTNET_SDK_VERSION: '2.1.401'

pool:
  vmImage: 'VS2017-Win2016'

steps:
- task: DotNetCoreInstaller@0
  displayName: 'Use .NET Core SDK $(DOTNET_SDK_VERSION)'
  inputs:
    version: '$(DOTNET_SDK_VERSION)'

- powershell: ./build.ps1
  displayName: 'Execute Cake PowerShell Bootstrapper'

- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    contents: '**\*'
    targetFolder: '$(Build.ArtifactStagingDirectory)/output'
    cleanTargetFolder: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'

The first line specifies the pipeline trigger. In our case, we will look at the master branch. Then I declare a variable to hold the .Net Core version; that way, it will be easier to maintain the script in the future.

The pool command specifies what kind of agent is created. Here I'm using a Windows one, yet I could have used Linux too (all the components are cross-platform).
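
For reference, a minimal sketch of the Linux variant could look like this; the hosted image name and the bash step are assumptions based on the agents available at the time:

pool:
  vmImage: 'Ubuntu-16.04'

steps:
# On Linux, the bash bootstrapper replaces the PowerShell one
- bash: ./build.sh
  displayName: 'Execute Cake Bash Bootstrapper'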

Then comes the list of steps. The first one installs .Net Core. The second step is a PowerShell command that executes our build.ps1 file. At this stage, the static website should be generated in a subfolder named output. The last two steps copy the content of the output folder into the ArtifactStagingDirectory and then publish it. This way, the Release Pipeline can access the artifacts.

There is detailed information about all the commands available in a YAML Azure Pipeline file in the documentation. Create your own or copy-paste this one into a new azure-pipeline.yml file under a subfolder named deployment. Once your file is created, commit and push it to GitHub or any repository.

Navigate to Azure DevOps (dev.azure.com). Open your project, or create a new one. Now, from the left menu, click on Pipelines (the rocket icon) to create a new one. If you are using an external repository, like me, you will need to authorize Azure DevOps to access your repo.

To configure the pipeline, since we have already created the azure-pipeline.yml file, select the Existing Azure Pipelines YAML file option and point it to our file in the deployment folder.



It will open our YAML file. If you wish, you can update it. Run it by clicking the blue Run button in the top-right corner. Your build pipeline is done. Now, every time you push changes to your repository, a build will be triggered and will generate the static website.

(Next post in the series - Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline)


In a video, please!

I also have a video of this post if you prefer.







Reading Notes #362



Suggestion of the week



Cloud



Programming



Miscellaneous



~

How to create a static website on Azure Storage

I have been waiting for this feature for so long! I know, it's not a major feature, but it fills an important gap in the Azure offer. We can now create static websites in Azure Blob Storage (as I'm writing this post, the service is still in preview). In this post, I will explain why I think this is really good news, and show how to create and publish a static website.

Why It's an Awesome News


The cloud is the perfect place when you need to build something huge very quickly. It's also an excellent solution when the number of resources you need varies a lot. Because Azure is a service, it will provide you with as many resources as you would like in a few minutes. And when you are done with the resources, you stop paying for them; it's really great like that!
However, if the only thing you need is to host a little something, like a blog or a small website for an event or some temporary publicity, Azure was not the best place for it. I mean, yes, of course, you could build a service and host many little websites on it (Scott Hanselman has excellent posts about that, like this one), but it always felt a bit overkill for most users. Some people kept an "old style" hosting provider just for that. I mean, it's fine, it works... But with Azure Storage, it will be really reliable, and at a lower cost! Let's see how we can create one.

Create a Static Website


To have the static website feature, you need to create an Azure Blob Storage account the same way you created them before; however, it needs to be of kind General Purpose V2 (GPv2). Today, if you install the Azure CLI Storage-extension Preview, you can use it to create one, or you can simply go to portal.azure.com. Let's use the portal since it's more visual.
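
For reference, creating the same kind of account with the Azure CLI looks roughly like this; the resource group name and location are assumptions for this example:

# Create a General Purpose V2 (GPv2) storage account, required for the static website feature
az storage account create --name frankdemo --resource-group frankdemo-rg --location eastus --sku Standard_LRS --kind StorageV2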

Once the storage is created, open it. In the left menu of the storage blade, click on the Static website (preview) option. That will open the configuration page for our static website. First, click the Enabled button, then enter the index document name (ex: index.html). Finally, click the Save button at the top of the blade.

The shell for our website is now created, and a new Azure Blob Storage container named $web has been created. The primary and secondary endpoints should now be displayed (ex: https://frankdemo.z13.web.core.windows.net/). If you test this URL, you will see a message saying that the content doesn't exist... and that's normal.


Create some content


This is the part where it all depends on your needs. You may already have some HTML pages ready, you may want to code them all yourself, or the website may already exist. For this post, I will create a brand-new blog using a static website generator named Wyam (if you would like to see how to do it with Jekyll, another generator, I use it in the video).
To create a new template with Wyam, use the following command in a command prompt. That will create a new website in the subfolder output.
wyam --recipe Blog --theme CleanBlog

Publish to Azure


It's now time to upload our content to the Azure Blob Storage. The easiest way is probably directly from the portal. To upload a file, click on the $web container, then the Upload button. From the new form, select the file and upload it.

The main problem with this method is that it only works one file at a time... and a website usually has many of those...
A more efficient way would be to use Azure Storage Explorer or some script. Azure Storage Explorer doesn't support the Azure Storage static website yet, but it will soon. So that leads us to scripts or command lines.

AzCopy


I really like AzCopy as it's very efficient and easy to use. Unfortunately, as I'm writing this post, AzCopy doesn't support the Azure Storage static website. I tried to upload all the content from the output folder (and its subfolders) with a command like this, but it fails.
azcopy --source ./output --destination https://frankdemo.blob.core.windows.net/$web --dest-key fec1acb473aa47cba3aa77fa6ca0c5fdfec1acb473aa47cba3aa77fa6ca0c5fd== --recursive

Azure CLI


An Azure CLI preview extension is also available. As I mentioned previously, the extension gives you the possibility to create a static website or update its configuration. To upload files, you have two options: the batch option would be more efficient, of course, but the file-by-file option also works. Thanks to Carl-Hugo (@CarlHugoM) for your help with these commands.


az storage blob upload-batch -s "./output" -d '$web' --account-key fec1acb473aa47cba3aa77fa6ca0c5fdfec1acb473aa47cba3aa77fa6ca0c5fd== --account-name frankdemo

az storage blob upload -f "./output/index.html" -c '$web' -n index.html --account-key fec1acb473aa47cba3aa77fa6ca0c5fdfec1acb473aa47cba3aa77fa6ca0c5fd== --account-name frankdemo

Visual Studio Code Azure Storage Extension

I finally tried the Visual Studio Code Azure Storage extension. After installing it, you need to add a user setting (Ctrl + ,). Add "azureStorage.preview.staticWebsites": true to your configuration. Now you just need to click on the extension, select the Azure Blob Storage from your subscription, and right-click to be able to upload a folder.
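
Concretely, the user settings.json only needs this one entry (a minimal sketch; VS Code accepts comments in its settings file):

{
    // Enable the preview support for Azure Storage static websites in the extension
    "azureStorage.preview.staticWebsites": true
}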

Depending on how many files you have, and their sizes, it will take a moment. VS Code will notify you when it's done. You will then be able to go back to your browser and refresh your website to see the result.


Conclusion


I'm very happy to see this feature because it fills a need that was not really covered yet by the Microsoft offer. Right now, it's an early preview, so even if the service is very stable, not all the tools support it, but that's only temporary. Right now, you can set your custom domain name; however, HTTPS is not supported.
So what do we do with it? Should we wait or jump right in? Well, as best practices imply, when a feature is in preview, don't put your core business on it yet. If you are just looking to build a personal website or a little promo site, then... enjoy!

In video, please!


I also have a video of this post if you prefer.



