
Upgrade an Application Windows Azure OS Family

Recently I had to upgrade a web site running in an Azure Web Role from Azure OS Family 1 (on SDK 1.6) to a more recent version. While the migration was not complicated, I encountered a few little particularities that I thought would be interesting to share.

The Context

The website was a Visual Studio 2010 project using Azure SDK 1.6 and a library called AspNetProvider, which was part of Microsoft's samples a few years ago, to manage session and membership. Using the AspNetProvider library, the session was saved in Azure blob storage and the membership was saved in a SQL database.

The Goal

The application must stay a Visual Studio 2010 project, but use the most recent Azure SDK and Azure Storage Client possible.

The Solution

  • Azure SDK 2.1
  • Azure Storage Client 4.0
  • Universal Provider version 1.2
  • OS Family 4

The Journey


Migration from SDK 1.6 to SDK 2.1


Azure SDK version 2.1 is the highest version compatible with Visual Studio 2010, and it can be downloaded from Microsoft's website. Once it is installed, just open the project in Visual Studio and right-click on the Azure project. By clicking the upgrade button, the magic will happen. A few errors may remain, but the hard work will be done for you.


Migration from AspNetProvider to UniversalProvider


We need to remove all references to the AspNetProvider library. Just expand the References node in the Solution Explorer and delete the reference. One important thing: since we are using Visual Studio 2010, the latest version of the UniversalProvider we can use is 1.2. More recent versions use .NET 4.5, which is not compatible with the present solution. To get the reference added to the project, just execute the following NuGet command:
Install-Package UniversalProvider -version 1.2

Check the web.config file to clean up the membership connection settings.
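For reference, after the switch the membership section of web.config should point to the Universal Providers. Here is a simplified sketch; the connection string name, applicationName and the exact attribute set are illustrative, not the literal output of the package:

<membership defaultProvider="DefaultMembershipProvider">
  <providers>
    <clear />
    <!-- DefaultMembershipProvider comes from the System.Web.Providers assembly installed by the UniversalProvider package -->
    <add name="DefaultMembershipProvider"
         type="System.Web.Providers.DefaultMembershipProvider, System.Web.Providers"
         connectionStringName="DefaultConnection"
         applicationName="/" />
  </providers>
</membership>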

Migration of the Azure Storage Client


This one is the easiest: just remove the reference in the References node and then execute the following NuGet command:
Install-Package Azure.Storage.Client

Migration of the membership data


The AspNetProvider used prefixed SQL tables: aspnet_Users, aspnet_Membership, etc. The new membership manager uses another set of tables, so we must migrate the data from one set to the other. Here is a SQL script that does exactly that. The script can be run multiple times, because it only copies data that has not already been moved.
-- ========================================================
-- Description:    Migrate data from asp_* tables 
--                 to the new table used by Universal provider
-- ========================================================

DECLARE @CNT_NewTable AS INT
DECLARE @CNT_OldTable AS INT

-- --------------------------------------------------------
-- Applications -------------------------------------------

INSERT INTO dbo.Applications (ApplicationName, ApplicationId, Description)
    SELECT    o.ApplicationName, o.ApplicationId, o.Description 
    FROM    dbo.aspnet_Applications o 
    LEFT    JOIN dbo.Applications n ON o.ApplicationId = n.ApplicationId
    WHERE    n.ApplicationId IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Applications 
SELECT @CNT_OldTable = Count(1) from aspnet_Applications

PRINT 'Application Count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' = ' + CAST(@CNT_OldTable AS VARCHAR)

-- -------------------------------------------------------- 
-- Roles --------------------------------------------------

INSERT INTO dbo.Roles (ApplicationId, RoleId, RoleName, Description)
SELECT    o.ApplicationId, o.RoleId, o.RoleName, o.Description 
FROM    dbo.aspnet_Roles o
LEFT JOIN dbo.Roles n ON o.RoleId = n.RoleId
WHERE n.RoleId IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Roles 
SELECT @CNT_OldTable = Count(1) from aspnet_Roles

PRINT 'Roles Count : ' + CAST(@CNT_NewTable AS VARCHAR) + ' = ' + CAST(@CNT_OldTable AS VARCHAR)

-- --------------------------------------------------------
-- Users --------------------------------------------------

INSERT INTO dbo.Users (ApplicationId, UserId, UserName, IsAnonymous, LastActivityDate)
SELECT o.ApplicationId, o.UserId, o.UserName, o.IsAnonymous, o.LastActivityDate 
FROM dbo.aspnet_Users o LEFT JOIN dbo.Users n ON o.UserId = n.UserID 
WHERE n.UserID IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Users 
SELECT @CNT_OldTable = Count(1) from aspnet_Users

PRINT 'Users count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)

-- --------------------------------------------------------
-- Memberships --------------------------------------------

INSERT INTO dbo.Memberships (ApplicationId, UserId, Password, 
PasswordFormat, PasswordSalt, Email, PasswordQuestion, PasswordAnswer, 
IsApproved, IsLockedOut, CreateDate, LastLoginDate, LastPasswordChangedDate, 
LastLockoutDate, FailedPasswordAttemptCount, 
FailedPasswordAttemptWindowStart, FailedPasswordAnswerAttemptCount, 
FailedPasswordAnswerAttemptWindowsStart, Comment) 

SELECT o.ApplicationId, o.UserId, o.Password, 
o.PasswordFormat, o.PasswordSalt, o.Email, o.PasswordQuestion, o.PasswordAnswer, 
o.IsApproved, o.IsLockedOut, o.CreateDate, o.LastLoginDate, o.LastPasswordChangedDate, 
o.LastLockoutDate, o.FailedPasswordAttemptCount, 
o.FailedPasswordAttemptWindowStart, o.FailedPasswordAnswerAttemptCount, 
o.FailedPasswordAnswerAttemptWindowStart, o.Comment 
FROM dbo.aspnet_Membership o
LEFT JOIN Memberships n ON  o.ApplicationId = n.ApplicationId
                      AND o.UserId = n.UserId
WHERE n.UserId IS NULL AND n.ApplicationId IS NULL


SELECT @CNT_NewTable = Count(1) from dbo.Memberships 
SELECT @CNT_OldTable = Count(1) from aspnet_Membership

PRINT 'Memberships count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)

-- -------------------------------------------------------
-- UsersInRoles ------------------------------------------
TRUNCATE TABLE dbo.UsersInRoles
INSERT INTO dbo.UsersInRoles SELECT * FROM dbo.aspnet_UsersInRoles


SELECT @CNT_NewTable = Count(1) from dbo.UsersInRoles 
SELECT @CNT_OldTable = Count(1) from aspnet_UsersInRoles

PRINT 'UsersInRoles count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)


Migration from OS Family 1 to 4

Open the .cscfg file and edit the osFamily attribute; it is on the ServiceConfiguration node.
<ServiceConfiguration serviceName="MyApp" osFamily="4" osVersion="*" ...>


Wrapping up

The only step left is to deploy to the staging environment to see if everything is working as expected. I would also recommend planning a further upgrade as soon as possible, because the official retirement date of Azure SDK 2.1 is November 2015. I hope this post can help you, even if you are migrating from and to different versions. Any comments, suggestions and/or questions are welcome.


~ Frank Boucher


The making of: Franky's Notes Azure Search - part 2

This post concludes The making of: Franky's Notes Azure Search. In the previous post, I built a console application in .NET using the RedDog.Search library to populate an index in my Azure Search service with my notes.
In this post, I'm sharing how I created the user interface to query my notes. To learn more about the Azure Search REST API, all the documentation is available online.

Objectives


For this part of the project, we will use Richard Astbury's azure-search JavaScript client, available on GitHub. The idea is to build a nice user interface (UI) that provides a simple and efficient way to search. Since the code will be in JavaScript, it's strongly suggested to use a query key instead of a master key. These keys can be managed from the Azure Portal.

Azure Search Query Keys

Creating the Interface


First, we need to get azure-search. You can either download the file azure-search.min.js from GitHub or execute npm install azure-search from a Node.js console.
Now we need a simple HTML page with a form, a textbox and a button.
    <html>
        <head>
            <title>Search</title>
            <link  href="css/bootstrap.min.css" rel="stylesheet">
            <!--[if lt IE 9]>
                <script src="scripts/html5shiv.min.js"></script>
                <script src="scripts/respond.min.js"></script>
            <![endif]-->
        </head>
        <body>
            <form>
                <label>Search</label>
                <input id="txtSearch" placeholder="Search">
                <button id="btnSearch" type="button">Search</button>
            </form>

            <div id="result"></div>

            <script src="scripts/jquery.min.js"></script>
            <script src="scripts/bootstrap.min.js"></script>
            <script src="scripts/azure-search.min.js"></script>
            <script>

                var client = AzureSearch({
                  url: "https://frankysnotes.search.windows.net",
                  key:"DB7B9D1C53EC08932D8A8D5A1406D8CA" // - query only
                });

            </script>
        </body>
    </html> 
As you can see, I'm creating the AzureSearch client using the query key from before. Afterwards, we create a Search function that retrieves the search criteria from the textbox and passes it to the client. An anonymous function is used as a callback; it receives the parameter noteList, which is an array of matching documents. We finally just need to loop through the results to build a nice output.
    function Search(){

        var _searchCriteria = $("#txtSearch").val();   
        var _objSearch = {search: _searchCriteria, $orderby:'title desc'};

        client.search('notes', _objSearch, function(err, noteList){
            var $divResult = $("div#result");
            $divResult.html( "<div class='panel-heading'><h3 class='panel-title'>" + noteList.length + " Result(s)</h3></div><div class='panel-body'>" );

            if(noteList.length > 0){

                var _strResult = "";
                _strResult = "<ul class='list-group'>";

                for(var key in noteList){
                    var fNote = noteList[key];

                    _strResult += "<li class='list-group-item'><a href='" + fNote.url + "' target='_blank'>" + fNote.title + "</a><p>" + fNote.note + "</p></li>";
                }

                _strResult += "</ul></div>";
                $divResult.append( _strResult );
            }
      });
    }

Putting all this together, we get a nice result.

Franky's Notes Search UI

Live Demo

Conclusion


I really had a lot of fun creating these little applications. I found the client incredibly easy to use. I hope it helps you get ideas and move forward with Azure Search. Any comments, suggestions and/or questions are welcome.


~ Frank Boucher


References


The making of: Franky's Notes Azure Search - part 1


For a long time now, I have been thinking about creating an API that would let me search easily through my notes. When Azure Search went public a few weeks ago, I knew it was what this project needed to come alive. In this post, I will share how I did it and, more importantly, show how incredibly easy it was to do.


What's Azure Search?


Currently in preview, Azure Search is a cloud-based search-as-a-service that provides a set of REST APIs defined in terms of HTTP requests and responses, in OData JSON format.

Getting Started


From the Azure Portal, let's create an Azure Search service by clicking the plus button at the bottom left of the screen. Select the Search option and fill in the options.

Azure_portal_crete_Search_Service_2014-10-20_0931

Application to populate my Azure Search service


First, we will need some data. My weekly Reading Notes posts are generated with a Ruby script that I wrote a few years ago. You can read more about it in First step with Ruby: Kindle Clipping Extractor. Basically, the script extracts my notes from my Kindle and builds a collection of notes grouped in different categories to generate a markdown file. Adding a new JSON output file was easy to do. Here is a quick view of this output.
{
  "json_class": "FrankyNotes",
  "categories": {
    "dev": [
      {
        "id": 77077357,
        "title": "Customize the MVC 5 Application Users’ using ASP.Net Identity 2.0",
        "author": "Dhananjay kumar",
        "url": "http://debugmode.net/2014/10/01/customize-the-mvc-5-application-users-using-asp-net-identity-2-0/",
        "note": "Need to get the fukk article",
        "tags": "dev,frankysnotes,readingnotes160",
        "date": "2014/10/17",
        "category": "dev"
      },
      {
        "id": 77156372,
        "title": "Custom Login Scopes, Single Sign-On, new ASP.NET Web API – updates to 
      [...]

Now that we have some data, we need to create an index and be able to add documents to it. A console application will be perfect for this job. At the time of writing this post, two libraries exist to interact with the Microsoft Azure Search REST API. For this part of the project, we will use the RedDog.Search library, available on GitHub, since it's a .NET library.

Note: To create an index or upload documents you will need an admin key.

Admin_Key

First, we need to create an index. Let's keep it simple and just create the index with all the properties of the JSON object. Here is the code of my CreateNoteIndex function.
private IndexManagementClient _client;

public IndexManagementClient Client
{
    get
    {
        if (_client == null){
            _client = new IndexManagementClient(ApiConnection.Create("frankysnotes", "AdminKey"));
        }
        return _client;
    }
}

public async Task<string> CreateNoteIndex()
{
    var createResult = await Client.CreateIndexAsync(new Index("notes")
        .WithStringField("id", opt => opt.IsKey().IsRetrievable())
        .WithStringField("title", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("author", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("url", opt => opt.IsRetrievable().IsSearchable(false))
        .WithStringField("note", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("tags", opt => opt.IsRetrievable().IsFilterable().IsSearchable())
        .WithStringField("date", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("category", opt => opt.IsRetrievable().IsFilterable().IsSearchable())
        );
    if (createResult.IsSuccess)
    {
        return "Index reset successfully";
    }

    return "Index creation failed";
}

To be able to search by note instead of by post, I decided to break the file down into multiple documents, with one note per document. After that, it was really easy to upload the documents into the index.
public async Task<string> AddNotes(string filepath)
{
    var docs = new List<IndexOperation>();
    FrankysNotes notes = DeserializeFNotes(filepath);

    foreach (var category in notes.categories)
    {
        foreach (var fNote in notes.categories[category])
        {
            var doc = ConvertfNote(fNote);
            docs.Add(doc);
        }
    }

    var result = await Client.PopulateAsync("notes", docs.ToArray<IndexOperation>());

    return "File uploaded successfully";
}


private FrankysNotes DeserializeFNotes(string filepath)
{
    var jsonStr = File.ReadAllText(filepath);
    var serializer = new JavaScriptSerializer();

    var notes = serializer.Deserialize<FrankysNotes>(jsonStr);
    return notes;
}

private IndexOperation ConvertfNote(FrankysNote fnote)
{
    var doc = new IndexOperation(IndexOperationType.Upload, "id", fnote.id)
                    .WithProperty("title", fnote.title)
                    .WithProperty("author", fnote.author)
                    .WithProperty("url", fnote.url)
                    .WithProperty("note", fnote.note)
                    .WithProperty("tags", fnote.tags)
                    .WithProperty("date", fnote.date)
                    .WithProperty("category", fnote.category);
    return doc;
}

To keep the code as clear as possible, I removed all validation and error management. The JSON file is deserialized, then, looping through all the notes, I build a list of IndexOperation. And finally, I upload all the notes with Client.PopulateAsync("notes", docs.ToArray<IndexOperation>());

Wrapping up


Using the RedDog.Search library to push documents into an Azure Search index was extremely easy. In fact, it's that simplicity that pushed me to share my discovery. In the next part of the series, I will create a simple HTML page to do real queries.

Stay tuned...

~ Frank Boucher

References

Why I switched to Markdown

Markdown is not a new video game, but a way to write in plain text that can easily be converted to HTML. This "new" standard is gaining popularity for many reasons. In this post, I will explain why I like it and show you some basic syntax and nice tools.

What is Markdown

Markdown's exact definition can be found on the Markdown website and looks like this:
Markdown is a text-to-HTML conversion tool for web writers. Markdown allows you to write using an easy-to-read, easy-to-write plain text format, then convert it to structurally valid HTML.
The Markdown syntax is very easy to learn. In fact, it comes mostly by itself, since it looks nice in any text editor. For example, titles, subtitles and lists look like this:

Title and sub-title
This Is My Title
================

Subtitle 1
----------

Here some items:
- Item 1
- item 2
- item 3

Easy, right?! Let's add two more samples, this time a bit more "complex": links and images.

Link

For links, two styles are possible:

Inline:

[Link Text](http://www.frankysnotes.com)
Reference:

[Link Text][1]
[Another one][link2]

...

[1]: http://www.frankysnotes.com 
[link2]: http://www.frankysnotes.com    

I personally prefer the reference style because it keeps the text clean and easy to read. To add an image, it's mostly the same two styles again, but with an exclamation point in front of the square brackets.

Image
![alternative text](http://frankysnotes.com/images/logo.png)

![alternative text][logo]

...

[logo]:http://frankysnotes.com/images/logo.png

These simple things cover mostly everything we need when writing documentation, blog posts or reference documents. Obviously, if you need more, you can always go to the Markdown website. I also put this full article online in Markdown format.


Why Markdown is so nice

First, I really like Markdown because I can edit my files on all platforms. Since they are regular text files, any text editor on Android, iOS, Windows Phone, PC or Linux will do the job perfectly. Plus, your text will never lose its formatting when switching from one device to another (as can happen with Word documents).
Likewise, since I'm working on different devices, I usually put my files in a shareable place like Dropbox or OneDrive. A simple text file is very small and quick to synchronize.

Tools, apps and more

Yes, you can edit your files in any text editor, but here are some nice tools that will improve your experience.

MarkdownPad

MarkdownPad is a full-featured Markdown editor for Windows. It's available in free and pro versions. Some interesting features are:
  • Instant HTML Preview
  • Easy Markdown formatting with keyboard shortcuts
  • Spell check
  • Use your own CSS
  • HTML and PDF Export
Website: http://markdownpad.com/


Denote
Denote is a Markdown text editor for Android that provides effortless syncing with Dropbox. Files created with Denote are saved as text (.txt) files.
  • Live preview for Markdown and HTML
  • Cloud based: Denote stores all its data in a subfolder on your personal Dropbox account so you can access it via your Mobile devices, Mac or PC
  • Offline support: changes are synced with Dropbox next time you're connected
  • Email files created in Denote
  • Customize font size and type face used for notes
Website: http://www.2storks.com/denote


Atom
This great text editor from GitHub has a nice Markdown Preview package that converts the Markdown to HTML.
Atom Website: https://atom.io/
Markdown Preview package: https://github.com/atom/markdown-preview


Sublime Text
Sublime Text is a well-known text editor in the developer community, and many different packages are also available.
Sublime Text Website: http://www.sublimetext.com/
Markdown Preview package: https://github.com/revolunet/sublimetext-markdown-preview

In conclusion

I hope this post motivates you to give it a try. Thanks for reading. Any comments, suggestions and/or questions are welcome.


~Frank Boucher


Let's play with Azure SQL Database backup and the Point in Time Restore

Did you know that you can have a full database backup once a week, differential database backups once a day, and transaction log backups every 5 minutes of your Azure SQL Database? Did you know that all this is done automatically when you are using the new Azure SQL Basic, Standard or Premium service tiers? Even more, you get access to the self-service Point in Time Restore. In this post, I will show how to "configure" the database to get the automatic backups, and how to do a restore.

Setup the automatic backup


If you are like me, your Azure SQL Databases are set to the Web or Business edition. The first thing to do is to change that. You will need to do it anyway, since the Web and Business service tiers will be retired in September 2015[1]. That is the only change required, since the backup service is built into the Basic, Standard and Premium tiers.
For demo purposes, I will use the Basic tier. To change the tier of the database, go to the Azure Portal. In the left panel, click on SQL Databases and select the database that you want to update (i.e. FrankDemo). Once the right section is updated, select the Scale tab and change the Service Tier to Basic.

Tiers_Basic

Now, if you return to the Dashboard tab, a new option will be available.

Restore_button

Your database now has built-in backups to support self-service Point in Time Restore and Geo-Restore. Azure SQL Database automatically creates backups using the following schedule:
  • Full database backup once a week
  • differential database backups once a day
  • transaction log backups every 5 minutes.
The full and differential backups are replicated across regions.

The retention period varies between 7 and 35 days, based on the selected tier.[2]

Restore an Azure SQL Database


Restoring a database is really easy. Remember that new button at the bottom of the screen? It's now time to click on it.


Restore_setings

This brings up the settings options. It's now time to type the name of your restored database; note that you must use a different name than the original. It's always a good practice to double-check that you are pointing at the right database on the correct server. Pick a restore point using the slider or by filling in the date and time fields. When you are done, click the button. The portal will let you know with a notification when the restore has completed successfully.


Restore_done
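If you prefer scripting over the portal, the same restore can be started from PowerShell. This is a minimal sketch, assuming the classic Azure PowerShell module of that era; the server, database and date values are examples, and the parameter names should be verified against your module version:

# Restore FrankDemo into a new database at a specific point in time (all values are examples).
Start-AzureSqlDatabaseRestore -SourceServerName "myserver01" `
                              -SourceDatabaseName "FrankDemo" `
                              -TargetDatabaseName "FrankDemo_Restored" `
                              -PointInTime "2014-10-20 09:30:00"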

I hope this post shows you how easy it is to use backup/restore with Azure SQL Database. Thank you for reading. Any comments, suggestions and/or questions are welcome.



[1]: Web and Business service tiers will be retired in September 2015, more
details on the Windows Azure site.
[2]: Detail about the retention period on Azure SQL Database Backup and Restore



~ Frank Boucher




Setup an automatic deployment on Azure with Dropbox in 5 minutes

(this post is also available in French)

This post is about creating an automatic deployment that can be used by everyone. I picked Dropbox as the source control because almost everyone has an account these days. If you need one, feel free to use this invite; it will give you 500 MB of bonus space for free!

Step 1: Configure the automatic deployment

To configure the deployment, connect to the Azure management portal. Although the new portal is my favourite for managing and visualizing information on websites, as I write this post the features needed for the Dropbox deployment are not yet available there. We must connect to the "old" portal and select the web site. If a website is not already created, you can add one using Quick Create.
After selecting the site, you need to click on the option Set up deployment from source control, which is located at the bottom right of the dashboard.


Step 1

From the dropdown list, choose Dropbox and click the arrow. Microsoft Azure will now ask you for access to a directory in your Dropbox account.

Step 2


Step 2: Publish Web Site

From your computer, access Dropbox. If you left the default settings, the directory should be under Apps / Azure / [dirname]. You can now copy the code, images and any other files that you need. After the synchronization with Dropbox has completed (the small green checks everywhere), you can return to the Azure portal.

It is now time to deploy. To do this you need to click Sync.


Step 5

Once completed you'll get a message informing you that the deployment is done. You can now check the log to see the deployment steps in detail if you wish.

Step 7

The new version of your website is now available!


Conclusion

Deploying a blog, a static business site or a family-owned site with Dropbox is so simple! It's even better than good old FTP: if something goes wrong, you can redeploy with one click.





References


~Frank


Quick Tips: How to save money with your VMs

(This post was originally published on Matricis Blog)
(English version of: Astuces rapides pour économiser avec les machines virtuelles Windows Azure)

At the office, we use a lot of virtual machines for our development and tests. It's really useful to be able to spin up any number of VMs in just a few minutes and dispose of them once the job is completed. However, sometimes we need them for a long period; wouldn't it be great if we could save money? In this post, I will show you how you can achieve that.

Tip #1

The first tip is to shut down the VMs when you don't need them. Why should you pay for them 24/7 if you only use them from 9 to 5, five days a week?! The common way to do it is by logging into the Windows Azure management portal and going to the virtual machine list. Select, one by one, the VMs you want to stop and click the Shutdown button at the bottom center of the screen.
If you have many VMs, this task can become very boring. A PowerShell script is the perfect tool for that. Here are two scripts to start or stop one virtual machine (a sketch for stopping several at once follows them).

Script to start the VM

#Full path of the publish Setting file downloaded from Azure.
$NameOfSettingFile = ".\MySettings.publishsettings"

# The name of the machine to get started
$VMName = "dev2"

# Azure Subscription Name
$SubscriptionName = "Frank Dev"

cls

if(!(Test-Path $NameOfSettingFile)){
    echo "Download the Publishing Setting files, and re-run the script."
    Get-AzurePublishSettingsFile
    exit
}

Import-AzurePublishSettingsFile  $NameOfSettingFile

Select-AzureSubscription -SubscriptionName $SubscriptionName

Write-Host "Starting the VM $VMName" -ForegroundColor Cyan
Start-AzureVM -Name $VMName -ServiceName $VMName

do{
    Start-Sleep -Seconds 2
    $vmInfo = Get-AzureVM -ServiceName $VMName -Name $VMName
    $vmStatus = $vmInfo.InstanceStatus
    Write-Host "The VM $VMName is still $vmStatus..."
}
while($vmStatus -ne "ReadyRole")

Write-Host "The VM $VMName is now accessible by Remote Connection." -ForegroundColor Green

read-host "Press enter key to close"

Script to stop the VM

#Full path of the publish Setting file downloaded from Azure.
$NameOfSettingFile = ".\MySettings.publishsettings"

# The name of the machine to stop
$VMName = "dev2"

# Azure Subscription Name
$SubscriptionName = "Frank Dev"

cls

if(!(Test-Path $NameOfSettingFile)){
    echo "Download the Publishing Setting files, and re-run the script."
    Get-AzurePublishSettingsFile
    exit
}

Import-AzurePublishSettingsFile  $NameOfSettingFile

Select-AzureSubscription -SubscriptionName $SubscriptionName

Write-Host "Stop the VM $VMName" -ForegroundColor Cyan
Stop-AzureVM -Name $VMName -ServiceName $VMName -Force

Write-Host "The VM $VMName is now shuting down." -ForegroundColor Green
read-host "Press enter key to close"
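
If you want to stop everything at once instead of one VM at a time, a small loop over Get-AzureVM can do it. This is a minimal sketch, assuming the same classic Azure PowerShell module and an already imported publish settings file:

# Stop (and deallocate) every VM that is currently running in the subscription.
Get-AzureVM |
    Where-Object { $_.InstanceStatus -eq "ReadyRole" } |
    ForEach-Object {
        Write-Host "Stopping $($_.Name)..." -ForegroundColor Cyan
        Stop-AzureVM -ServiceName $_.ServiceName -Name $_.Name -Force
    }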

Tip #2


The second tip can only be applied to VMs that don't need to scale or to be behind a load balancer. A developer's machine is the perfect case for that. We need to change the tier of the VM. This is a relatively new feature, so all your VMs are probably set to the Standard tier at the moment.

Quick_tip2_en

We need to change the tier to Basic. Once this change is applied, the billing rate will change. You can go to the pricing page for more detail, but to give you an idea, on a large VM this represents about $40. Interested? Here's how to change this setting. First, if you are not already connected, connect to the Windows Azure management portal. Then, in the virtual machine list, select the VM you want to change. Click on the Settings tab, then change the tier. If the VM is running it will need a reboot, so save any work beforehand.
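
The tier change can also be scripted. This is a minimal sketch, assuming the classic Azure PowerShell module; the service name, VM name and the "Basic_A2" size are examples only:

# Resize the VM to a Basic tier size; note that this restarts the VM.
Get-AzureVM -ServiceName "dev2" -Name "dev2" |
    Set-AzureVMSize -InstanceSize "Basic_A2" |
    Update-AzureVM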

I hope these quick tips help you. Let me know if you have other ideas and I will add them.


~Frank




How to be sure that your PowerShell scripts are doing what you think?

Recently, I was required to write PowerShell scripts to automate some Microsoft Azure deployments. I was really happy to share those scripts, because I knew other developers would use them; they would add functionality and the library would grow. The idea was nice, but how could we be sure that, while adding functionality, they didn't break something else? I didn't know any unit test framework for PowerShell, so I decided to do a quick search online. I was sure some kind of framework already existed. What I didn't expect, however, was to find a framework that is really easy to use, complete and free! In this post, I will introduce you to Pester, a wonderful PowerShell framework available on GitHub.

What is Pester?

Pester is a Behavior-Driven Development (BDD) unit test framework that implements a lot of functionality, like mocking and exception management.

How to Install

The framework is available on github.com, but it can also be downloaded from different repositories. My personal favorite is to install it from Chocolatey, because it's the most painless method. The Chocolatey command to install Pester is:
cinst pester 

Before continuing, if you try a Pester command right now, you will probably get the error message: "The term '___' is not recognized as the name of a cmdlet…". One way to fix that is to add Pester to your profile. This can be done by executing the following command.

{Import-Module "C:\Chocolatey\lib\pester.2.0.4\tools\Pester.psm1"} | New-Item -path $profile -type file -force;. $profile

Get Started

Now that Pester is available on your machine, let's start by creating our first test. Pester can scaffold an empty script file and a test file for you. If you add Pester to an already existing library, don't be scared: Pester will not override the existing files.

Open a PowerShell console and type the following command:

New-Fixture c:\dev\pesterDemo pesterDemo
cd c:\dev\pesterDemo

You can try the new test by invoking Pester with this command:

Invoke-Pester

ResultDefaultPesterTest

As you can see a test already exists, and it failed.

Simple test

Let's create a real test that reads an XML file to extract a property. Here is the content of the XML file and the code for both the script and the test files.

manifest.xml
<service>
    <name>Employee Provider</name>
    <version>1.2</version>
</service>

pesterDemo.Tests.ps1
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path).Replace(".Tests.", ".")
. "$here\$sut"

Describe "#pesterDemo#" {

    Context "Test with reel XML document."{

        It "Loads a reel manifest." {
            $xml = getManifest ".\manifest.xml"
                $xml.OuterXMl | Should Be "<service><name>Employee Provider</name><version>1.2</version></service>"
        }

        It "Return the service name."{
            $sName = GetServiceName ".\manifest.xml"
                $sName | Should Be "Employee Provider"
        }

    }
}

pesterDemo.ps1
function GetManifest($manifestPath){
        if(Test-Path $manifestPath)
        {
            [xml]$xml = Get-Content $manifestPath
            return $xml
    }
    else{
        Throw "Error. Ïnvalid Manifest file path."
    }
}

function GetServiceName($manifestPath){
        $manifest = GetManifest($manifestPath)
        return $manifest.service.name
}

Result of the test

The script file contains two simple functions. The first one, [GetManifest], verifies that the file exists and returns its content; otherwise, an exception is thrown. The second function, [GetServiceName], retrieves the content of the XML file and returns the value of the [name] property of the [service] node.

The test file contains two tests. The first one, Loads a real manifest, tests that the manifest.xml file is correctly loaded and validates its content. The second test, Returns the service name, validates that the name of the service is the one expected. We can now invoke the tests.

Invoke-Pester

ResultFirstPesterTest

As you can see, all tests succeeded and the output is really clear. Now, what if the path passed as a parameter doesn't exist? Let's assume that, in our design, we wanted the code to throw an exception. Good thing for us, Pester already has everything needed to manage exceptions.

Let's write a test to validate this "requirement." First, define a code block that calls [GetManifest] with a wrong path, then pipe the result into the Pester command Should. If you add the following code to pesterDemo.Tests.ps1 and invoke Pester, all tests should succeed.

It "Throw an exception when loads a manifest with invalid path." {
{$xml = GetManifest ".\WrongPath\manifest.xml" }  |  Should Throw
}

Test with Mock

Sometimes, we don't want our tests to interact with real components; that's why mocks are so useful. Thankfully for us, creating a mock with Pester is a piece of cake. In all the previous tests, a real XML file was used. To do some validation on the content of the file, I could have as many different files as there are scenarios or, simpler, I could use a mock.

With Pester, we must set up a context where our mock will be present. Here is a test where I mock the XML file. As expected, any path will work since the file is not real.

Context "Test with Mocks."{

    Mock GetManifest{return [xml]"<service><name>Employee Provider</name><version>1.2</version></service>"}

    It "Any path will works"{
        $xml =  GetManifest -MockWith ".\wrongPath\manifest.xml"
            $xml.OuterXMl | Should Be "<service><name>Employee Provider</name><version>1.2</version></service>"
    }
}

Wrapping up

Of course, this little introduction is not an exhaustive list of all of Pester's features. I hope that with these little code snippets I gave you the drive you needed to test your PowerShell scripts. So next time you write that amazing script to deploy your app on Microsoft Azure, think of Pester.

References



~Frank


As strong as a team

Everybody knows that we are stronger together. At work, we are grouped in small teams to accomplish more and... but wait, are we really working as one big strong force, or are we working in a group but focusing on our own personal achievements? Let me share with you how, as a team, we achieved what we thought was impossible.

The Context

The project was an application integrating different parts of the client's environment. A lot of technology was involved and of course, the schedule planned for it was really tight. We were a small team with different levels of experience.

The fall

As a team, we thought that splitting the job was the best option. Every member took a piece of the puzzle and started working on his side.
The juniors got overwhelmed by complex patterns and best practices; they forgot the client's needs and stubbornly tried to use new technologies. The more experienced were pulled into this swirl of questions and technological challenges, and lost their bearings, their concentration and the big picture.
The solution we were building was looking like an ugly patchwork that was not even functional. Continuing this way was a guarantee of failure.

The recovery

Of course, I wouldn't be writing this post if we had continued that way; instead, we met and looked at what we could do better. Fortunately for us, most of what we had done was not a loss. If we re-focused, with a bit of a ramp-up and some extra hours put in, everything was still possible.
We did some lunch-and-learns to share knowledge. We wrote templates to get everyone on the same base when starting new sections. We also produced short documents explaining how to get started with the concepts and technologies present in the project. The goal was to keep them short and simple so they could be read at a glance.
However, all this would have been lost without code reviews for everyone. Code reviews are often perceived negatively, but a review is the chance to improve not only "your" code; it is also a chance to standardize variable names, namespace structures and code patterns, and to get to know the functions and tools available.
Another valuable thing was our daily scrum meetings. That way, as soon as one team member had issues, the best resources could jump in to resolve the problem or share their previous experience on how to solve this kind of complication.
With all these efforts, not only did we deliver a great solution, but we built something better: a team. People did not merely work with each other; they were team members working in synergy, and that is a great accomplishment.

The Lesson

I discovered something that I already knew! A chain is only as strong as its weakest link. It sounds so simple, but isn't it always like that with the truth? Feel free to share what you did to improve your teamwork in the comments section.


~Frank

References

  • www.scrum.org

Localization from url with Asp.Net MVC 4



To offer a website in multiple languages using ASP.NET, we simply need to add some resource (.resx) files to our project, and voilà. Based on the language of the browser, IIS will match the localization resource. However, what if we want to give the choice to the user?

The goal

One pattern often used to achieve this, is to split the resource based on the URL.

Ex:
http://www.DomainName.com/EN/About for the English path
http://www.DomainName.com/FR/About for the French path

The needs

We will need something to grab the desired language from the URL and set it as the current language.

The Solution

First, let's add a new route. ASP.NET uses routing to parse the URL and extract information from it, while keeping the displayed URL user friendly. The default route is {controller}/{action}/{id}. So, without a new route, our language would be treated as a controller name, and that's no good.
The route we want is {lang}/{controller}/{action}/{id}. We could just replace the default, but then what would happen to "normal" calls?

The best thing to do is to add our localized route first and keep the default one second. The way routing works, it will use the first route that matches. Since we will use the two-letter code for the chosen language, let's add the first part, {lang}, which must be two characters long to be considered valid. You can see this in the code, where I define a constraint using the following regex: "^[a-zA-Z]{2}$".

Class_RouteConfig
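Since the original screenshot of the RouteConfig class is not available here, this is a minimal sketch of what it could look like, based on the route and the regex constraint described above (class layout and default values are assumptions):

using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // Localized route first: the {lang} segment must be exactly two letters.
        routes.MapRoute(
            name: "Localized",
            url: "{lang}/{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional },
            constraints: new { lang = "^[a-zA-Z]{2}$" });

        // Default route second, for "normal" calls without a language prefix.
        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }
}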
Now that we have the language, we must change the UI culture of the current thread. I decided to use an attribute, making it easy to apply. Under the Filters folder, add a new class that inherits from ActionFilterAttribute. This filter checks whether "lang" is available in the RouteData; if lang is present, it changes the CurrentUICulture of the current thread. If "lang" is not part of the URL, it sets it to the default.

Class_LocalizationAttribute
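Again, the original screenshot is missing, so here is a minimal sketch of such a filter, assuming "en" as the default language:

using System.Globalization;
using System.Threading;
using System.Web.Mvc;

public class LocalizationAttribute : ActionFilterAttribute
{
    private const string DefaultLanguage = "en"; // assumption: English as the fallback

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        // Read the language from the route data; fall back to the default if it is absent.
        var lang = filterContext.RouteData.Values["lang"] as string;
        if (string.IsNullOrEmpty(lang))
        {
            lang = DefaultLanguage;
        }

        Thread.CurrentThread.CurrentUICulture = CultureInfo.GetCultureInfo(lang);

        base.OnActionExecuting(filterContext);
    }
}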

We could put the [Localization] attribute on every controller class, but a better practice is to create a BaseController class and use it there.

Class_BaseController
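A sketch of that base controller; every controller deriving from it gets the localization behaviour for free:

using System.Web.Mvc;

[Localization]
public abstract class BaseController : Controller
{
}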

Voilà, you can now change the language of the entire website by changing the URL.

Bonus: Use it everywhere


Using it in the views and the controllers is easy: just add the namespace and type Resources.NameOfYourResourceString.
To use it in validation and in the code generated from the model, we can use something like this:

Class_PersonModel
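The model screenshot is also missing, so here is a sketch of what such a model could look like; the Resources class and the resource string names are assumptions:

using System.ComponentModel.DataAnnotations;

public class PersonModel
{
    // Display name and validation message both come from the .resx resources.
    [Display(Name = "UserName", ResourceType = typeof(Resources))]
    [Required(ErrorMessageResourceType = typeof(Resources), ErrorMessageResourceName = "UserNameRequired")]
    public string UserName { get; set; }
}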

This way, the code in the view stays clean and simple.

@Html.LabelFor(model => model.UserName)

I hope this post helps you in your development efforts. Any comments, suggestions and/or questions are welcome.
 

~Frank



How Windows Azure Simplified my Development

(This post was originally published on the Matricis Blog)

Usually my posts are a lot more technical, but this time I decided to share an experience with Windows Azure that saved me a lot of headaches while saving my boss a bunch of money ;)

The Context

Here at Matricis, we often set up our development environments on virtual machines (VMs), which we host on our internal infrastructure. We have several different development VM configurations, based on the technologies and versions needed. A big advantage of doing so is that if the required environment changes, we simply choose the corresponding VM template. For the project I'm about to talk about, we needed quite a powerful development system, especially since every developer required Visual Studio 2012, SharePoint Foundation 2013, SQL Server, ADFS, and a handful of other tools (Fiddler, Notepad++, different browsers, etc.).

The Problem

To be able to develop with SharePoint, it is strongly recommended to have at least 8 gigabytes of RAM (I first tried with just 6 gigs, but it was still a nightmare). My laptop only has 8 gigs of memory, so I couldn't run the VM locally. I asked our IT guys if it was possible to host the VM on a local on-premise server. They answered that they didn't have enough space for the environment. They were very sorry, but I was actually quite happy about it; I now had a perfect use-case to work in Windows Azure!

The Solution

I went to see my boss and explained the situation: instead of buying a brand new server for development and test environment purposes, we should simply use Windows Azure’s IaaS! We could start setting up the VM in less than 10 minutes. In an hour we would be ready to code! The development VM would only be up while it was in use, meaning that it wouldn’t cost a cent while nobody was working on it. On project completion… we would delete all the VMs we were using, and no more fees! My boss loved the idea!

The core team for this project involved four full-time developers, and here is a high-level look at our development environment: the Active Directory is shared, but every developer has their own SharePoint, SQL Server and ADFS, making them autonomous.

Development environment


As you can see, it's a hybrid environment, since the Team Foundation Server (TFS) is on one of our local servers. In the morning, I start my VM and, within a few minutes, I'm connected remotely and ready to work on a great machine. With a little PowerShell script that I wrote, I don't even need to log in to the Azure Portal to start and stop my VM. Another great joy of this scenario is that I can now work from anywhere and on any kind of machine: from home on the family computer without VPN, or from a hotel on my laptop or my Surface Pro! Happiness often comes from simple things.

In general, I would say that the experience was very positive, but along the road we did encounter some issues that we had to resolve. Since all IPs on Azure are dynamic, we discovered that the domain controller hosted in Azure must be started before the other VMs. This way, its IP will always be the first one, and therefore the other VMs will find it without any issue. Furthermore, in our architecture, the source control (TFS) is on-premise. When you check out or check in your code, you are passing through the firewall. However, since these actions are intensive, the firewall may interpret the activity as an attack.

Because the job of a firewall is to protect your network, you can imagine what happened... the connection was lost. Once we identified this and created firewall exceptions for the Azure VMs, everything was good.

In Conclusion

I hope this post encourages you to try Windows Azure as a development and test environment, because it's a really effective and cost-beneficial way to execute on different projects. For more information about Windows Azure Infrastructure as a Service, go to the Windows Azure web site.


~Frank


Quick trick: From Feedly to your eReader in one click

I do most of my reading offline, on public transit, on my eBook reader. So part of my morning routine is to look for new articles to read and send them to my reading device. Since this week, thanks to the new Readability updates, I can do this in only one click! Let me share this with you.

The Tooling


Readability


I'm using Readability to keep track of all the posts I'm reading. It's also very useful because it cleans everything up and keeps only the text of the article. Readability can also automatically send posts to your device.
Post before and after in Readability


Readability extension for Chrome


I'm also using the Readability extension for Chrome, which lets you send a page or a link to your reading list from anywhere. No need to copy-paste the post's URL. Since last week, options are also available in the context menu.
Readability extension for Chrome


Feedly


To keep track of all the RSS feeds I follow, I use Feedly. The interface is very polished and the application keeps improving. It is also available on iPhone and Android, in addition to being a web app.
Feedly Interface


The Trick

Finally, because Feedly runs in a web browser, you can use the Readability extension with it. So from now on, when I find an interesting article, I can send it to my eBook reader in one click.

Enjoy!

Feedly_Readability_Kindle






References



How to copy blobs or VHDs between different Windows Azure subscription

(An updated version of this post is available in both English and French)

First of all, why would you want to copy a Virtual Hard Drive (VHD) or a blob from one Windows Azure subscription to another? It could be for doing backups, because your Windows Azure trial is ending, or to get a copy of a client's VM to investigate a problem. So, when one of my clients asked me if it was possible, my answer was: "Yes it is, using command-line tools". But since he was looking for a simple solution, I wrote him a little script that I will share here so everyone can enjoy it.
 

Get Started

To do the copy, you only need to run one command. But, in order to execute this command, you need to have the Windows Azure command-line tool already installed. At the end of this post, a script to install the Windows Azure command-line tool is provided.
 

The command

Having installed the required tool, you can execute this one line from the Windows Azure command-line:
azure vm disk upload <source-path> <target-blob-url> <target-storage-account-key>

Get the source

In this command, you must replace <source-path> with the URL of the VHD or blob that you want to copy from.
You can get this URL through the Windows Azure Portal, using the account you want to copy from.

  1. On the left side of the screen click on the Storage icon.
  2. Then click on the storage name.
  3. From the top of the screen click on Containers
  4. And when the container list appears click on the name of the container to get the details view.

2012-10-13_0757_-_Step_1-2

Get the destination

Now that the "from" has been identified, we need to specify the "to". We must replace <target-blob-url> with the URL of the Windows Azure Storage container in the destination Azure subscription. If the blob container already exists, just connect to that account and follow the previous steps. Otherwise, you need to create a new one by using the "+" button at the bottom left of the screen.

The easiest way is to give the container public access for the duration of the transfer. You can set this option when creating or editing the container, using the button at the bottom of the screen.

Then set the access property to Public Container.






Get the key

Last part, but not the least important: we must get the storage account key and replace <target-storage-account-key> with it. You can find it via the Manage Keys button on the Storage dashboard. You can use either the primary or the secondary access key.



 


Install the Windows Azure command-line tool


Here is a little script that you should put in a ".cmd" or ".bat" file. It will install the Windows Azure command-line tool with Node.js and Chocolatey. After running the script, a console window that looks like this should be open.










@powershell -NoProfile -ExecutionPolicy unrestricted -Command "iex ((new-object net.webclient).DownloadString('http://bit.ly/psChocInstall'))" && SET PATH=%PATH%;%systemdrive%\chocolatey\bin
@powershell "cinst nodejs.install" && SET PATH=%PATH%;%ProgramFiles(x86)%\nodejs
@powershell -ExecutionPolicy unrestricted "npm install azure -g" && SET PATH=%PATH%;%USERPROFILE%\AppData\Roaming\npm\
@powershell azure
pause

References





  • Windows Azure command-line tool
  • Node Packaged Modules or Node.js
  • Chocolatey 





  • Three New Advanced Windows Azure Virtual Workshops


    You did the boot camp and the regular workshops, but you're still looking for more?
    Be ready: starting October 15, a new advanced Windows Azure virtual workshop series is starting. I will be there in the chat, with the Windows Azure Canadian Community Experts, to answer all your questions!

    Workshop #1 Cloud Variations

    October 15 – Cloud Variations

    Explore Windows Azure’s infrastructure options – Cloud Services, Virtual Machines, and Websites. Through comparing and contrasting the options, you will walk away from the workshop better equipped to choose the right option for different solutions.

    Register today: aka.ms/cloudvariations
     

    October 22 - Building Connected Apps with Windows Azure

    Discover how the various Windows Azure services can be integrated into apps to extend their functionality. You will learn how to address common needs such as computing capabilities, storage, authentication, and more. To demonstrate concepts, this workshop focuses on Windows 8 and Windows Phone, though concepts can be applied to any platform including Android and iOS.

    Register today: aka.ms/connectedappswithazure
     

    October 29 - Migrating Apps to Windows Azure

    Explore approaches to migrating applications, walk through concerns and considerations to take into account while planning a migration, and learn how to implement a migration plan to move applications from on-premises (or traditional hosting) to Windows Azure.

    Register today: aka.ms/migratingtoazure

     
    Go deeper and learn more! For more Info you can go on the Windows Azure Virtual Workshops.









    Azure Tools Belt: Auto-scaling Application Block – WASABi

    One of the questions that clients often ask is: what tools can be used for Windows Azure development? Everyone knows that you need a web browser and a code editor (e.g. Visual Studio), but what else? So, I decided to do a series of posts to present them. This second post is about the Auto-scaling Application Block, or WASABi for friends.

    This series is not meant to be an exhaustive list of all tools. Some other excellent tools are surely available. If you think I have forgotten one or want me to talk about one, let me know. I will be more than happy to add it to the list.

    Already in the Azure Tool Belt:

    What is WASABi?

    Until now, to achieve elasticity, you needed to do it manually through the Azure management portal or by writing your own code using the REST API. Using WASABi, you just need to define some rules and the application will scale automatically: it can use a schedule or can be triggered by metrics, for example running fewer instances at night or adding extra instances if the CPUs are used at more than 80%.

    The auto-scaling application block can be hosted either in a Windows Azure role or in an on-premises application. The auto-scaling application is typically hosted in a separate application from the target application that you want to scale.

    Various scenarios are available to help you manage the auto-scaling by dynamically changing instance counts or performing application throttling of web/worker roles. The rules can auto scale based on timetables or metrics collected from the application and/or Windows Azure. You can even use notifications to preview any scaling operations before they take place and you can also use some PowerShell cmdlets to manage the autoscaler. You can constrain the auto scaling by:
    • Setting the instance counts upper and lower bounds
    • Preventing fast oscillations in the number of role instances with the stabilizer
    • Limiting scaling operations acknowledging billing hours

    How to use it

    For this demo, we will use a simple Hello World application and scale it based on time, with a rule, via a console application.


    Step 1: Put the app in Azure

    To get started, an Azure application is needed. Don't forget to assign a certificate, since the console application will need it.

    You can then publish the application on the cloud.





    Step 2: Adding the scaling application

    Now create a console application and name it AutoScalingConsole. Add the WASABi package by executing: Install-Package EnterpriseLibrary.WindowsAzure.Autoscaling.
    It should run without error, and your solution should look like this.







    Step 3: Add and configure the rules

    Add a new file called Rules.xml and set the property Copy to Output Directory: Copy always. Copy-paste this XML into the rules file.
    <?xml version="1.0" encoding="utf-8" ?>
    <rules xmlns="http://schemas.microsoft.com/practices/2011/entlib/autoscaling/rules">
      <constraintRules>
        <rule name="default" enabled="true" rank="1" description="The default constraint rule">
          <actions>
            <range min="1" max="1" target="AutoscalingApplicationRole"/>
          </actions>
        </rule>
        <rule name="peaktime" enabled="true" rank="10" description="Increase instance count at peak times">
          <timetable startTime="20:00:00"  duration="00:20:00" />
          <actions>
            <range min="2" max="4" target="AutoscalingApplicationRole"/>
          </actions>
        </rule>
      </constraintRules>
    </rules>
    

    It contains two rules: a default one that is always active, defining minimum and maximum instance counts of 1, and a second one used for scaling. The default rule's rank of one means that it can be overridden by other constraint rules with a higher rank. Naturally, if that rule is applied, there will be only a single instance of the role.

    The second rule is named peaktime. This rule has the same target, a higher rank, a minimum value of two and a maximum value of four. A timetable also makes the rule active for 20 minutes, starting 10 minutes from the current time (adjust the startTime value in the XML accordingly).

    Step 4: Define the service model

    You will now add a new XML file called services.xml and set the property Copy to Output Directory: Copy always. Copy-paste this XML into the services file.
    <?xml version="1.0" encoding="utf-8" ?>
    <serviceModel xmlns="http://schemas.microsoft.com/practices/2011/entlib/autoscaling/serviceModel">
    
      <subscriptions>
        <subscription name="[yoursubscriptionname]"
                      certificateThumbprint="[yourmanagementcertificatethumbprint]"
                      subscriptionId="[yoursubscriptionid]"
                      certificateStoreLocation="CurrentUser" certificateStoreName="My">
          <services>
            <service dnsPrefix="[yourhostedservicednsprefix]" slot="Staging">
              <roles>
                <role alias="AutoscalingApplicationRole" roleName="AutoscalingApplicationRole" wadStorageAccountName="elazurehol"/>
              </roles>
            </service>
          </services>
          <storageAccounts>
            <storageAccount alias="elazurehol"
                            connectionString="DefaultEndpointsProtocol=https;AccountName=[yourstorageaccountname];AccountKey=[yourstorageaccountkey]">
            </storageAccount>
          </storageAccounts>
        </subscription>
      </subscriptions>
    </serviceModel>

    In this file, make the following changes:

    Replace [yoursubscriptionname] with the name of your Windows Azure subscription and [yoursubscriptionid] with your Windows Azure subscription ID.

    Replace [yourmanagementcertificatethumbprint] with your Windows Azure management certificate thumbprint.

    Replace [yourhostedservicednsprefix] with the URL prefix of your Windows Azure hosted service.

    Replace [yourstorageaccountname] with your Windows Azure storage account name and [yourstorageaccountkey] with your Windows Azure storage account primary access key.

    Step 5: Configure the Auto-scaling Application Block

    Right-click on the App.config file in Solution Explorer (add one if needed), then click Edit Configuration File. In the Blocks menu, click Add Autoscaling Settings. Now set rules.xml and services.xml as the sources for the Rules Store and the Service Information Store. Via the File menu, Save, then Exit.

    wasabidemo_4

    To be able to track what is happening during testing, let's add some logging. In Visual Studio, double-click on the App.config file to open it in the editor, then add this system.diagnostics section at the end of the file:
    <system.diagnostics>
       <sources>
          <source name="Autoscaling General"  switchName="SourceSwitch" switchType="System.Diagnostics.SourceSwitch" />
          <source name="Autoscaling Updates"  switchName="SourceSwitch" switchType="System.Diagnostics.SourceSwitch" />
       </sources>
       <switches>
          <add name="SourceSwitch" value="Verbose, Information, Warning, Error, Critical" />
       </switches>
    </system.diagnostics>

    Step 6: Try it

    You can now run the console application and observe how the auto-scaling rules work with the Azure application. Check the Output window in Visual Studio, which logs which rules are being matched.
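
    The console host itself only needs to resolve and start the Autoscaler; everything else comes from the App.config settings. A minimal sketch, assuming the standard Enterprise Library container API shipped with WASABi:

    using System;
    using Microsoft.Practices.EnterpriseLibrary.Common.Configuration;
    using Microsoft.Practices.EnterpriseLibrary.WindowsAzure.Autoscaling;

    class Program
    {
        static void Main()
        {
            // The Autoscaler reads the autoscaling settings (rules and service model) from App.config.
            Autoscaler autoscaler = EnterpriseLibraryContainer.Current.GetInstance<Autoscaler>();
            autoscaler.Start();

            Console.WriteLine("Autoscaler running. Press Enter to stop.");
            Console.ReadLine();

            autoscaler.Stop();
        }
    }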


    Conclusion

    Using WASABi makes your application elastic, but it doesn't make your application scalable; you must therefore design for scalability. If you have any comments, suggestions or experiences to share, feel free to let me know by adding a comment, by e-mail or through the contact page.
     

    Where to find more info?


    Note:


    ~Frank