Upgrade an Application's Windows Azure OS Family

Recently I had to upgrade a website running in an Azure web role from OS family 1 (Azure SDK 1.6) to a more recent version. While the migration was not complicated, I encountered a few little particularities that I thought would be interesting to share.

The Context

The website was a Visual Studio 2010 project using Azure SDK 1.6 and a library called AspNetProvider that was part of a Microsoft sample a few years ago, used to manage session and membership. With the AspNetProvider library, the session state was saved in Azure blob storage and the membership data in a SQL database.

The Goal

The application must stay a Visual Studio 2010 project, but use the most recent Azure SDK and Azure Storage Client possible.

The Solution

  • Azure SDK 2.1
  • Azure Storage Client 4.0
  • Universal Providers version 1.2
  • OS family 4

The Journey


Migration from SDK 1.6 to SDK 2.1


Azure SDK version 2.1 is the highest version compatible with Visual Studio 2010, and it can be downloaded from Microsoft's website. Once it is installed, just open the solution in Visual Studio and right-click the Azure project. Clicking the upgrade button makes the magic happen. A few errors may remain, but the hard work will be done for you.


Migration from AspNetProvider to UniversalProvider


First, we need to remove all references to the AspNetProvider library. Just expand the References node in the Solution Explorer and delete the reference. One important thing: since we are using Visual Studio 2010, the latest version of the Universal Providers we can use is 1.2. More recent versions require .NET 4.5, which is not compatible with the present solution. To add the reference to the project, just execute the following NuGet command:
Install-Package UniversalProvider -version 1.2

Then check the web.config file and clean up the old AspNetProvider membership and connection entries, so that only the provider configuration added by the Universal Providers package remains.

Migration of the Azure Storage Client


This one is the easiest: just remove the old Storage Client reference from the References node and then execute the following NuGet command:
Install-Package Azure.Storage.Client
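Note that the 4.x Storage Client lives in different namespaces than the 1.x client shipped with SDK 1.6 (Microsoft.WindowsAzure.Storage instead of Microsoft.WindowsAzure.StorageClient), so the blob code has to be adjusted. Here is a minimal sketch of blob access with the newer library; the connection string, container and blob names are only illustrative:

// Minimal blob access with the 4.x Storage Client (namespaces changed from the 1.x library).
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobSample
{
    static void Main()
    {
        // Replace with your real storage connection string.
        CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("sessions");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference("sample.txt");
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes("hello")))
        {
            blob.UploadFromStream(stream);
        }
    }
}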

Migration of the membership data


The AspNetProvider used prefixed SQL tables: aspnet_Users, aspnet_Membership, etc. The new membership provider uses another set of tables, so we must migrate the data from one set to the other. Here is a SQL script that does exactly that. The script can be run multiple times because it only copies the rows that have not been moved yet.
-- ========================================================
-- Description:    Migrate data from aspnet_* tables 
--                 to the new table used by Universal provider
-- ========================================================

DECLARE @CNT_NewTable AS INT
DECLARE @CNT_OldTable AS INT

-- --------------------------------------------------------
-- Applications -------------------------------------------

INSERT INTO dbo.Applications (ApplicationName, ApplicationId, Description)
    SELECT    o.ApplicationName, o.ApplicationId, o.Description 
    FROM    dbo.aspnet_Applications o 
    LEFT    JOIN dbo.Applications n ON o.ApplicationId = n.ApplicationId
    WHERE    n.ApplicationId IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Applications 
SELECT @CNT_OldTable = Count(1) from aspnet_Applications

PRINT 'Application Count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' = ' + CAST(@CNT_OldTable AS VARCHAR)

-- -------------------------------------------------------- 
-- Roles --------------------------------------------------

INSERT INTO dbo.Roles (ApplicationId, RoleId, RoleName, Description)
SELECT    o.ApplicationId, o.RoleId, o.RoleName, o.Description 
FROM    dbo.aspnet_Roles o
LEFT JOIN dbo.Roles n ON o.RoleId = n.RoleId
WHERE n.RoleId IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Roles 
SELECT @CNT_OldTable = Count(1) from aspnet_Roles

PRINT 'Roles Count : ' + CAST(@CNT_NewTable AS VARCHAR) + ' = ' + CAST(@CNT_OldTable AS VARCHAR)

-- --------------------------------------------------------
-- Users --------------------------------------------------

INSERT INTO dbo.Users (ApplicationId, UserId, UserName, IsAnonymous, LastActivityDate)
SELECT o.ApplicationId, o.UserId, o.UserName, o.IsAnonymous, o.LastActivityDate 
FROM dbo.aspnet_Users o LEFT JOIN dbo.Users n ON o.UserId = n.UserID 
WHERE n.UserID IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Users 
SELECT @CNT_OldTable = Count(1) from aspnet_Users

PRINT 'Users count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)

-- --------------------------------------------------------
-- Memberships --------------------------------------------

INSERT INTO dbo.Memberships (ApplicationId, UserId, Password, 
PasswordFormat, PasswordSalt, Email, PasswordQuestion, PasswordAnswer, 
IsApproved, IsLockedOut, CreateDate, LastLoginDate, LastPasswordChangedDate, 
LastLockoutDate, FailedPasswordAttemptCount, 
FailedPasswordAttemptWindowStart, FailedPasswordAnswerAttemptCount, 
FailedPasswordAnswerAttemptWindowsStart, Comment) 

SELECT o.ApplicationId, o.UserId, o.Password, 
o.PasswordFormat, o.PasswordSalt, o.Email, o.PasswordQuestion, o.PasswordAnswer, 
o.IsApproved, o.IsLockedOut, o.CreateDate, o.LastLoginDate, o.LastPasswordChangedDate, 
o.LastLockoutDate, o.FailedPasswordAttemptCount, 
o.FailedPasswordAttemptWindowStart, o.FailedPasswordAnswerAttemptCount, 
o.FailedPasswordAnswerAttemptWindowStart, o.Comment 
FROM dbo.aspnet_Membership o
LEFT JOIN Memberships n ON  o.ApplicationId = n.ApplicationId
                      AND o.UserId = n.UserId
WHERE n.UserId IS NULL AND n.ApplicationId IS NULL


SELECT @CNT_NewTable = Count(1) from dbo.Memberships 
SELECT @CNT_OldTable = Count(1) from aspnet_Membership

PRINT 'Memberships count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)

-- -------------------------------------------------------
-- UsersInRoles ------------------------------------------
TRUNCATE TABLE dbo.UsersInRoles
INSERT INTO dbo.UsersInRoles SELECT * FROM dbo.aspnet_UsersInRoles


SELECT @CNT_NewTable = Count(1) from dbo.UsersInRoles 
SELECT @CNT_OldTable = Count(1) from aspnet_UsersInRoles

PRINT 'UsersInRoles count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)


Migration from OS Family 1 to 4

Open the .cscfg file and edit the osFamily attribute; it's on the ServiceConfiguration node.
<ServiceConfiguration serviceName="MyApp" osFamily="4" osVersion="*" ...>


Wrapping up

The only step left is to deploy to the staging environment and see if everything works as expected. I would also recommend planning the upgrade as soon as possible, because the official retirement date of Azure SDK 2.1 is November 2015. I hope this post helps you, even if you are migrating from and to different versions. Any comments, suggestions and/or questions are welcome.


~ Frank Boucher


Reading Notes #163

image from Microsoft Connect (on Channel9)

Suggestion of the week


Cloud


Programming

  • .NET Core is Open Source - Get a better understanding of what .NET Core is and the meaning and purpose of the open-source announcement.
  • The Roadmap for WPF - This post gives all the details about what is coming next for Windows Presentation Foundation (WPF).

Miscellaneous



~Frank


Reading Notes #162

Suggestion of the week


Cloud


Programming


Miscellaneous


~Frank Boucher


Reading Notes #161

Cloud


Programming


Miscellaneous


~Frank Boucher



The making of: Franky's Notes Azure Search - part 2

This post concludes The making of: Franky's Notes Azure Search. In the previous post, I built a console application in .NET using the RedDog.Search library to populate an index in my Azure Search service with my notes.
In this post, I'm sharing how I created the user interface to query my notes. To learn more about the Azure Search REST API, all the documentation is available online.

Objectives


For this part of the project, we will use the azure-search JavaScript client by Richard Astbury, available on GitHub. The idea is to build a nice user interface (UI) that provides a simple and efficient way to search. Since the code runs in JavaScript in the browser, it's strongly suggested to use a query key instead of the master key: query keys can only be used to search, not to modify the index. These keys can be managed from the Azure Portal.

Azure Search Query Keys

Creating the Interface


First, we need to get the azure-search client. You can either download the file azure-search.min.js from GitHub or execute npm install azure-search from a Node.js console.
Now we need a simple HTML page with a form, a textbox and a button.
    <html>
        <head>
            <title>Search</title>
            <link  href="css/bootstrap.min.css" rel="stylesheet">
            <!--[if lt IE 9]>
                <script src="scripts/html5shiv.min.js"></script>
                <script src="scripts/respond.min.js"></script>
            <![endif]-->
        </head>
        <body>
            <form>
                <label>Search</label>
                <input id="txtSearch" placeholder="Search">
                <button id="btnSearch" type="button">Search</button>
            </form>

            <div id="result"></div>

            <script src="scripts/jquery.min.js"></script>
            <script src="scripts/bootstrap.min.js"></script>
            <script src="scripts/azure-search.min.js"></script>
            <script>

                var client = AzureSearch({
                  url: "https://frankysnotes.search.windows.net",
                  key:"DB7B9D1C53EC08932D8A8D5A1406D8CA" // - query only
                });

            </script>
        </body>
    </html> 
As you can see, I'm creating the AzureSearch client using the query key from before. Next, we create a Search function that retrieves the search criteria from the textbox and passes it to the client. An anonymous function is used as the callback; it receives the parameter noteList, which is an array of matching documents. Finally, we just need to loop through the results to build a nice output.
    function Search(){

        var _searchCriteria = $("#txtSearch").val();   
        var _objSearch = {search: _searchCriteria, $orderby:'title desc'};

        client.search('notes', _objSearch, function(err, noteList){
            var $divResult = $("div#result");
            $divResult.html( "<div class='panel-heading'><h3 class='panel-title'>" + noteList.length + " Result(s)</h3></div><div class='panel-body'>" );

            if(noteList.length > 0){

                var _strResult = "";
                _strResult = "<ul class='list-group'>";

                for(var key in noteList){
                    var fNote = noteList[key];

                    _strResult += "<li class='list-group-item'><a href='" + fNote.url + "' target='_blank'>" + fNote.title + "</a><p>" + fNote.note + "</p></li>";
                }

                _strResult += "</ul></div>";
                $divResult.append( _strResult );
            }
      });
    }

    // Wire the button to the Search function (this wiring is assumed; it is not shown in the snippet above).
    $("#btnSearch").click(Search);

If we put all this together, we get a nice result.

Franky's Notes Search UI

Live Demo

Conclusion


I really had a lot of fun creating these little applications. I found the client incredibly easy to use. I hope this gives you ideas and helps you move forward with Azure Search. Any comments, suggestions and/or questions are welcome.


~ Frank Boucher


References


Reading Notes #160

Suggestion of the week


Cloud


Programming

  • Don't Frown on CSVs - Nice post that explains why we shouldn't overlook the good old CSV format.

Miscellaneous


~Frank Boucher


The making of: Franky's Notes Azure Search - part 1


For a long time now, I have been thinking about creating an API that would let me search easily through my notes. When Azure Search went public a few weeks ago, I knew it was what this project needed to come alive. In this post, I will share how I did it and, more importantly, show how incredibly easy it was to do.


What's Azure Search?


Currently in preview, Azure Search is a cloud-based search-as-a-service that provides a set of REST APIs defined in terms of HTTP requests and responses, in OData JSON format.
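To give an idea of what those REST calls look like, here is a minimal query sketch using HttpClient. The service and index names come from this series, the key is a placeholder, and the api-version shown is the preview version available at the time:

// Minimal sketch of an Azure Search query over raw REST (no client library).
using System;
using System.Net.Http;

class SearchRestSample
{
    static void Main()
    {
        using (var http = new HttpClient())
        {
            // Query keys are passed in the api-key header.
            http.DefaultRequestHeaders.Add("api-key", "YOUR-QUERY-KEY");

            var url = "https://frankysnotes.search.windows.net/indexes/notes/docs"
                    + "?search=azure&api-version=2014-07-31-Preview";

            // The response is an OData JSON payload listing the matching documents.
            string json = http.GetStringAsync(url).Result;
            Console.WriteLine(json);
        }
    }
}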

Getting Started


From the Azure Portal, let's create an Azure Search service by clicking the plus button at the bottom left of the screen. Select the Search option and fill in the options.

Azure portal: create a Search service

Application to populate my Azure Search service


First, we will need some data. My weekly Reading Notes posts are generated with a Ruby script that I wrote a few years ago. You can read more about it in First step with Ruby: Kindle Clipping Extractor. Basically, the script extracts my notes from my Kindle and builds a collection of notes grouped into different categories to generate a markdown file. Adding a new JSON output file to the script was easy. Here is a quick view of this output.
{
  "json_class": "FrankyNotes",
  "categories": {
    "dev": [
      {
        "id": 77077357,
        "title": "Customize the MVC 5 Application Users’ using ASP.Net Identity 2.0",
        "author": "Dhananjay kumar",
        "url": "http://debugmode.net/2014/10/01/customize-the-mvc-5-application-users-using-asp-net-identity-2-0/",
        "note": "Need to get the fukk article",
        "tags": "dev,frankysnotes,readingnotes160",
        "date": "2014/10/17",
        "category": "dev"
      },
      {
        "id": 77156372,
        "title": "Custom Login Scopes, Single Sign-On, new ASP.NET Web API – updates to 
      [...]

Now that we have some data, we need to create an index and be able to add documents to it. A console application is perfect for this job. At the time of writing this post, two libraries exist to interact with the Microsoft Azure Search REST API. For this part of the project, we will use the RedDog.Search library, available on GitHub, since it's a .NET library.
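To reference the library from the console project, it can be installed with a NuGet command similar to the ones used earlier (the package id on NuGet is RedDog.Search):
Install-Package RedDog.Search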

Note: To create an index or upload documents you will need an admin key.

Admin_Key

First, we need to create an index. Let's keep it simple and just create the index with all the properties of the JSON object. Here is the code of my CreateNoteIndex function.
private IndexManagementClient _client;   // backing field (not shown in the original snippet)

public IndexManagementClient Client
{
    get
    {
        if (_client == null){
            _client = new IndexManagementClient(ApiConnection.Create("frankysnotes", "AdminKey"));
        }
        return _client;
    }
}

public async Task<string> CreateNoteIndex()
{
    var createResult = await Client.CreateIndexAsync(new Index("notes")
        .WithStringField("id", opt => opt.IsKey().IsRetrievable())
        .WithStringField("title", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("author", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("url", opt => opt.IsRetrievable().IsSearchable(false))
        .WithStringField("note", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("tags", opt => opt.IsRetrievable().IsFilterable().IsSearchable())
        .WithStringField("date", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("category", opt => opt.IsRetrievable().IsFilterable().IsSearchable())
        );
    if (createResult.IsSuccess)
    {
        return "Index created successfully";
    }

    // Error handling is out of scope for this post; simply report the failure.
    return "Index creation failed";
}
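For context, this is roughly how the console application can call it. The NoteIndexer class name is hypothetical, since the original post does not show the entry point:

// Hypothetical entry point. NoteIndexer is an assumed class that contains
// the Client property and the CreateNoteIndex method shown above.
static void Main(string[] args)
{
    var indexer = new NoteIndexer();
    Console.WriteLine(indexer.CreateNoteIndex().Result);  // blocks until the call completes
}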

To be able to search by note instead of by post, I decided to break the file down into multiple documents, one note per document. After that, it was really easy to upload the documents into the index.
public async Task<string> AddNotes(string filepath)
{
    var docs = new List<IndexOperation>();
    FrankysNotes notes = DeserializeFNotes(filepath);

    // notes.categories is assumed to be a Dictionary<string, List<FrankysNote>> (see the JSON sample above)
    foreach (var category in notes.categories.Keys)
    {
        foreach (var fNote in notes.categories[category])
        {
            var doc = ConvertfNote(fNote);
            docs.Add(doc);
        }
    }

    var result = await Client.PopulateAsync("notes", docs.ToArray<IndexOperation>());

    return "File uploaded successfully";
}


private FrankysNotes DeserializeFNotes(string filepath)
{
    var jsonStr = File.ReadAllText(filepath);
    var serializer = new JavaScriptSerializer();

    var notes = serializer.Deserialize<FrankysNotes>(jsonStr);
    return notes;
}

private IndexOperation ConvertfNote(FrankysNote fnote)
{
    var doc = new IndexOperation(IndexOperationType.Upload, "id", fnote.id)
                    .WithProperty("title", fnote.title)
                    .WithProperty("author", fnote.author)
                    .WithProperty("url", fnote.url)
                    .WithProperty("note", fnote.note)
                    .WithProperty("tags", fnote.tags)
                    .WithProperty("date", fnote.date)
                    .WithProperty("category", fnote.category);
    return doc;
}

To keep the code as clear as possible, I removed all validation and error management. The JSON file is deserialized; then, looping through all the notes, I build a list of IndexOperation. Finally, I upload all the notes with Client.PopulateAsync("notes", docs.ToArray<IndexOperation>()).
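The FrankysNotes and FrankysNote classes used by DeserializeFNotes are not shown in the post; a minimal sketch matching the JSON output above could look like this (property names mirror the JSON keys):

// Hypothetical POCOs matching the JSON produced by the Ruby script (not shown in the original post).
using System.Collections.Generic;

public class FrankysNotes
{
    public string json_class { get; set; }

    // e.g. "dev" -> list of notes in that category
    public Dictionary<string, List<FrankysNote>> categories { get; set; }
}

public class FrankysNote
{
    // kept as a string so it can be used directly as the Azure Search document key
    public string id { get; set; }
    public string title { get; set; }
    public string author { get; set; }
    public string url { get; set; }
    public string note { get; set; }
    public string tags { get; set; }
    public string date { get; set; }
    public string category { get; set; }
}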

Wrapping up


Using the RedDog.Search library to push documents into an Azure Search index was extremely easy. In fact, it's that simplicity that pushed me to share my discovery. In the next part of the series, I will create a simple HTML page to do real queries.

Stay tuned...

~ Frank Boucher

References

Reading Notes #159

Suggestion of the week

 

Cloud

 

Programming

 

Database

 

Miscellaneous

~Frank Boucher

Reading Notes #158

Suggestion of the week


Cloud

  • Designing for Big Scale in Azure (K. Dotchkoff) - Nice post that explains how we should change our designs in the cloud and use logical containers or scale units.

Programming

  • Building a Better NuGet (Edward Charbeneau) - Nice post that gives us best practices for developing a NuGet package.

Miscellaneous


~Frank