Reading Notes #168

Suggestion of the week


Cloud


Programming


Miscellaneous


See you in 2015!


~Frank B


Reading Notes #167

Suggestion of the week


Cloud


Programming

  • .NET Blog (Immo Landwerth [MSFT]) - Very interesting and complete post that explains the goals behind .NET Core and how it is useful to us.

Books

DynamoDB Applied Design Patterns
By Uchit Vyas, Prabhakaran Kuppusamy
Publisher: Packt Publishing
Released: September 2014
ISBN 13: 9781783551897

This book takes you from wherever you are and brings you straight up into the cumulus. It starts gently by explaining the basics of NoSQL, then moves on to DynamoDB fundamentals: what a data model is and how to create a table.

Chapter after chapter, the book moves you to the next level by introducing new complexities and explaining how to manage them with its simple but complete scenario. All the code samples are in Java and extremely well explained; even with my .Net background, I could always follow along.

The book ends by presenting many best practices to get the most out of DynamoDB, and compares it to other popular NoSQL databases.

I definitely recommend this book to anyone looking for a NoSQL database. DynamoDB is unavoidable when you are looking for a non-relational database, and this book is a must-have on your bookshelf.

Miscellaneous



~Frank B


Stop losing time installing your software

Automation is the key to many hard situations. Everyone knows that, yet a lot of tasks are done manually. In this post, I will share with you some easy steps to install all your applications.

Blue Gears

Let's get started


For those who follow this blog, I have talked about Chocolatey many times. For all the others: Chocolatey is a Machine Package Manager, somewhat like apt-get (on Linux), but built with Windows in mind.
Assuming that you don't have Chocolatey installed, let's start with that. Open a command prompt (cmd.exe) as Administrator, and execute this command:
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "iex ((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1'))" && SET PATH=%PATH%;%ALLUSERSPROFILE%\chocolatey\bin
Now, anytime you need an application, you can simply run a choco install command. This will download the latest version of the package and install it on your PC.
Here are some examples with popular applications:

Application Command
Notepad++ choco install notepadplusplus
Atom choco install atom
7-Zip choco install 7zip
Skype choco install skype

You can find the complete list of all applications on the Chocolatey web site. Many other commands are also available to search, list and update some packages:

Command Description
choco install atom -Version 0.140.0 Installs an old version (0.140.0) of Atom
choco list nunit Shows all packages that contain NUnit
choco update Updates Chocolatey to the latest version
choco update notepadplusplus Updates Notepad++ to the latest version


Let's go a step further


Wouldn't it be nice if we could chain them? This is possible with Boxstarter, which provides repeatable, reboot-resilient Windows environment installations using Chocolatey packages. To install Fiddler, Atom, and Visual Studio Community 2013, simply type this URL in Internet Explorer (IE) and Boxstarter will start its magic.
http://boxstarter.org/package/fiddler4,atom,visualstudiocommunity2013


Note: This works only in Internet Explorer (IE); in Chrome or Firefox you will need a "Click-Once" extension.



Turn it up to 11


Now that we know we can chain them, what is stopping us from creating a script that will install ALL our favorite applications? Well, nothing!

Boxstarter is a very easy and powerful way to automate the installation of software. Here is an example showing some of the great features supported by Boxstarter. In the following script, I will configure Windows, install the latest Windows updates, install Visual Studio Community 2013 with an extension, and install many other applications.

# Windows Configuration
Update-ExecutionPolicy Unrestricted
Set-ExplorerOptions -showHidenFilesFoldersDrives -showProtectedOSFiles -showFileExtensions
Enable-RemoteDesktop
Disable-InternetExplorerESC
Disable-UAC #Win8
Set-TaskbarSmall

if (Test-PendingReboot) { Invoke-Reboot }

# Update Windows and reboot if necessary
Install-WindowsUpdate -AcceptEula
if (Test-PendingReboot) { Invoke-Reboot }

# Install Visual Studio Community 2013
choco install visualstudiocommunity2013

# VS extensions
Install-ChocolateyVsixPackage PowerShellTools http://visualstudiogallery.msdn.microsoft.com/c9eb3ba8-0c59-4944-9a62-6eee37294597/file/112013/6/PowerShellTools.vsix


# Install WebPI (Platform Installer) 
choco install webpi

# Install
#  . SSDT
#  . Microsoft Azure SDK - 2.4.1
#  . Microsoft Azure SDK for .NET (VS 2012) - 2.4
C:\Program Files\Microsoft\Web Platform Installer>WebpiCmd.exe /Applications: SSDT, WindowsAzureSDK_2_4_1, VWDOrVs2012AzurePack.2.4


if (Test-PendingReboot) { Invoke-Reboot }

# Install Favourite Tools
choco install sourcetree
choco install resharper
choco install Atom
choco install LinqPad
choco install fiddler4

You can now execute this script by entering this URL in IE:
http://boxstarter.org/package/url?C:/dev/Demoscript.txt 
In this case, the file was in a local folder, but it could have been in a shared folder on another server, or somewhere online like gist.github.com.
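If you prefer to skip the browser entirely, the same script can be run from PowerShell with Boxstarter's own module. Here is a minimal sketch, assuming Boxstarter was installed beforehand (for example with choco install boxstarter) and reusing the same local script path as above:

# Load the Boxstarter module that understands Chocolatey packages and scripts
Import-Module Boxstarter.Chocolatey

# Point Boxstarter at the script; remove -DisableReboots to keep the
# reboot-resilient behavior described earlier
Install-BoxstarterPackage -PackageName "C:\dev\Demoscript.txt" -DisableReboots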

Wrapping up


I hope this helps you automate repetitive tasks and save time. For more information about Chocolatey or Boxstarter, visit their respective websites. Any comments, suggestions, and/or questions are welcome.


~Frank B

Reading Notes #166

Suggestion of the week


Cloud


Programming


Miscellaneous


~Frank B.


Reading Notes #165

 

Suggestion of the week

 

Cloud

Programming

Integration

  • Azure BizTalk Services: An Introduction - This is the second post of a series on BizTalk. We are still in introduction mode, but that's fine: BizTalk is not a small application or system, and in this age of the Internet of Things, it's good to have the right tools.
~ Frank B

Toby, did you see what I just did!

Today, I was running around with my laptop trying to find someone to show what I had just done. The problem is that since I work from home, I found no one except my dog... Toby, did you see what I just did! He looked at me and didn't really care that I was doing some C# in Atom, a regular text editor. So here I am now, sharing my discovery with you.

The "What"

While reading some articles on the Internet, I stumbled upon a video about OmniSharp.
A family of Open Source projects, each with one goal - To enable great .NET development in YOUR editor of choice.
So I decided to give it a try in one of my favorite text editors, Atom.
Less than two minutes later, I was running across my house...

The "How"

What I like about Atom is that it is so easy to install and customize. The easiest way to install it is via Chocolatey.
Chocolatey NuGet is a Machine Package Manager, somewhat like apt-get, but built with Windows in mind.
Assuming that you don't have Chocolatey installed, let's start with that. Open a command prompt (cmd.exe) as Administrator, and execute this command:
@powershell -NoProfile -ExecutionPolicy unrestricted -Command "iex ((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1'))" && SET PATH=%PATH%;%ALLUSERSPROFILE%\chocolatey\bin

In a new command prompt, again with administrator permissions, let's install the text editor Atom:
cinst Atom

and Git:

cinst Git

Now, to install OmniSharp in Atom, you have two options: through the Settings, or using a console. For this post, I will use the second option. Let's open a new command prompt, again as Administrator.
The reason I use a new prompt every time is to make sure the environment variables get refreshed.
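If you prefer to stay in the same prompt, Chocolatey also ships a small refreshenv command that reloads the environment variables in place (assuming your Chocolatey version includes it):

refreshenv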

Execute these commands:
apm install language-csharp
apm install autocomplete-plus-async
apm install omnisharp-atom

Now open Atom and let's write some code:
using System;

namespace ConsoleAppDemo
{
    class Program
    {
        static void Main(string[] args)
        {
          var myBook = new Book{Title="Get Started with OmniSharp"};
          Console.WriteLine(String.Format("Here is my review of: {0}", myBook.Title));
        }
    }

    public class Book
    {
      private string _review;

      public string Title{get;set;}

      public string Review{
        get{
          if(String.IsNullOrEmpty(_review))
          {
            _review = "This book is nice";
          }
          return _review;
        }
        set{
          _review = value;
        }
      }
    }
}

Nothing special until you start the OmniSharp server with Ctrl-Alt-o.

Boom!

Atom IntelliSense


As you can see, we now have IntelliSense, completion, code navigation, and much more! If you click the little green flame at the bottom left, you see details about notifications and error messages.

notification


The end


OmniSharp is a front-end compiler, not a complete compiler, so it doesn't generate CLI code. But it's already a lot. Today, you can use OmniSharp with Sublime Text 3, Atom, Emacs, Brackets, Vim, or from the command line only. So whether on your old laptop or your new PC, whether you run Linux, Mac, or Windows, let's do some C#!


~Frank Boucher


Four Simple Tips to Improve your Asp.Net MVC Project

When it's time to refactor a solution, it's always a good idea to clean the code first. In this post, I will share with you some simple but very efficient ways to improve your solution.

1- Forget the magic string


By default in ASP.NET MVC, magic strings are used everywhere.
return View("Index");
or
@Html.ActionLink("Delete Dinner", "Delete", "Dinners", new { id = Model.DinnerID }, null)

Nothing bad here, but nothing will tell you that you made a typo, or that the method name has changed. This is where T4MVC becomes a great tool to add to all your projects.

To add it, a simple NuGet command is enough: Install-Package T4MVC. This adds a T4 file (T4MVC.tt) to your project, which generates a number of files. These generated files simplify your life and give you the opportunity to code using strong typing.

Here are a few transformations:
// Before ----------------
  return View("Index");

// After with T4MVC
  return View(Views.Index);
An action link in a view.
// Before ----------------
  @Html.ActionLink("Delete Product", "Delete", "Products", new { id = Model.ProductID }, null)

// After with T4MVC
  @Html.ActionLink("Delete Product", MVC.Products.Delete(Model.ProductID))

An Ajax call.
// Before ----------------
<%= Ajax.ActionLink( "RSVP for this event",
                 "Register", "RSVP",
                 new { id=Model.DinnerID }, 
                 new AjaxOptions { UpdateTargetId="rsvpmsg", OnSuccess="AnimateRSVPMessage" }) %>

// After with T4MVC
<%= Ajax.ActionLink( "RSVP for this event",
                 MVC.RSVP.Register(Model.DinnerID),
                 new AjaxOptions { UpdateTargetId="rsvpmsg", OnSuccess="AnimateRSVPMessage" }) %>

A redirection.
// Before ----------------
return RedirectToAction("Details", new { id = product.ProductID });

// After with T4MVC
return RedirectToAction(MVC.Products.Details(product.ProductID));

When writing code, it gives you IntelliSense where you normally would not have any. At compile time, it validates all the code, so no typos or other misspelling errors slip through.

2- Clean your views


You know all those "@using" statements at the top of each view that we copy over and over... It's time to remove them. The way to do it is to move them to the web.config file in the "Views" folder.

web.config location

So you can move the namespaces used globally
@using Microsoft.Security.Application
@using System.Globalization;

by including them in this section:
<system.web.webPages.razor>
  <pages pageBaseType="System.Web.Mvc.WebViewPage">
    <namespaces>
      <add namespace="System.Web.Mvc" />
      <add namespace="System.Web.Mvc.Ajax" />

      <add namespace="Microsoft.Security.Application" />
      <add namespace="System.Globalization" />

    </namespaces>
  </pages>
</system.web.webPages.razor>

3- Don't lose time debugging


Too many people lose time debugging their application or website. Start using Glimpse right away! It provides real-time information across all layers of your application, from the UI to the server and database side. Perfect for knowing everything that happens on the click of a button: JavaScript validation, controller code, and even the query in the database.







Install it in ten seconds with the NuGet package manager and pick the version you need.

PM> Install-Package Glimpse



Glimpse is secure and configured to be accessible only from localhost by default. But don't take my word for it: try it yourself, or go check out the one-minute Glimpse Heads Up Display YouTube video.

4- Start monitoring your website health and usage


Move your website to Microsoft Azure and use Application Insights. This gives you the opportunity to monitor the availability, performance, and usage of your live application.

Add Application Insights


To add it, you have many possibilities. One of them is from Visual Studio 2013: just right-click on the project, select Add Application Insights Telemetry, and voilà!

Now you just need to run or deploy the website, and after a few minutes you will have plenty of information and graphs waiting for you in the Azure Portal.
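Beyond the automatic collection, you can also send custom telemetry from your own code. Here is a minimal sketch, assuming the Microsoft.ApplicationInsights NuGet package is referenced; the class, method, and event names are just examples:

using System;
using Microsoft.ApplicationInsights;

public class OrderService
{
    // One TelemetryClient can be reused for the lifetime of the service
    private readonly TelemetryClient _telemetry = new TelemetryClient();

    public void PlaceOrder()
    {
        // Shows up with the custom events in the Azure Portal
        _telemetry.TrackEvent("OrderPlaced");

        try
        {
            // ... business logic ...
        }
        catch (Exception ex)
        {
            // Exceptions tracked here appear alongside the other telemetry
            _telemetry.TrackException(ex);
            throw;
        }
    }
}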



Information


You will find a lot more information about Application Insights on the Microsoft Azure website.


Wrapping up

I hope this will help you. Thanks for reading. Any comments, suggestions, and/or questions are welcome.

~ Frank Boucher


References



Reading Notes #164


Suggestion of the week


Cloud


Programming

 

Miscellaneous

~Frank B

Upgrade an Application Windows Azure OS Family

Recently I had to upgrade a website running in an Azure WebRole from Azure OS Family 1 (SDK 1.6) to a more recent version. While the migration was not complicated, I encountered some little particularities that I thought would be interesting to share.

The Context

The website was a Visual Studio 2010 project using Azure SDK 1.6 and a library called AspNetProvider that was part of Microsoft's samples a few years ago to manage session and membership. Using the AspNetProvider library, the session was saved in Azure blob storage and the membership in a SQL database.

The Goal

The application must stay a Visual Studio 2010 project, but use the most recent Azure SDK and Azure Storage Client possible.

The Solution

  • Azure SDK 2.1
  • Azure.StorageClient 4.0
  • Universal Provider version 2.1
  • OS Family 4

The Journey


Migration from SDK 1.6 to SDK 2.1


Azure SDK version 2.1 is the highest version compatible with Visual Studio 2010, and it can be downloaded from Microsoft's website. Once it is installed, just open the project in Visual Studio and right-click on the Azure project. By clicking the upgrade button, the magic will happen. Some errors may remain, but the hard work will be done for you.


Migration from AspNetProvider to UniversalProvider


We need to remove all references to the AspNetProvider library. Just expand the References node in the Solution Explorer and delete the reference. One important thing: since we are using Visual Studio 2010, the latest version of the UniversalProvider we can use is 1.2. More recent versions use .NET 4.5, which is not compatible with the present solution. To get the reference added to the project, just execute the following NuGet command:
Install-Package UniversalProvider -version 1.2

Then check the web.config file to clean up the membership configuration and connection strings.
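For reference, once the Universal Providers package is installed, the membership section of the web.config ends up looking roughly like this; a sketch only, the connection string name and values are placeholders:

<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Data Source=...;Initial Catalog=...;User ID=...;Password=..."
       providerName="System.Data.SqlClient" />
</connectionStrings>
<system.web>
  <membership defaultProvider="DefaultMembershipProvider">
    <providers>
      <clear />
      <add name="DefaultMembershipProvider"
           type="System.Web.Providers.DefaultMembershipProvider, System.Web.Providers"
           connectionStringName="DefaultConnection"
           applicationName="/" />
    </providers>
  </membership>
</system.web>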

Migration of the Azure Storage Client


This one is the easiest: just remove the reference in the References node and then execute the following NuGet command:
Install-Package Azure.Storage.Client

Migration of the membership data


The AspNetProvider was using prefixed SQL tables: aspnet_Users, aspnet_Membership, etc. The new membership manager uses another set of tables, so we must migrate the data from one set to the other. Here is a SQL script that will do exactly that. The script can be run multiple times because it only copies the rows that have not been moved yet.
-- ========================================================
-- Description:    Migrate data from asp_* tables 
--                 to the new table used by Universal provider
-- ========================================================

DECLARE @CNT_NewTable AS INT
DECLARE @CNT_OldTable AS INT

-- --------------------------------------------------------
-- Applications -------------------------------------------

INSERT INTO dbo.Applications (ApplicationName, ApplicationId, Description)
    SELECT    o.ApplicationName, o.ApplicationId, o.Description 
    FROM    dbo.aspnet_Applications o 
    LEFT    JOIN dbo.Applications n ON o.ApplicationId = n.ApplicationId
    WHERE    n.ApplicationId IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Applications 
SELECT @CNT_OldTable = Count(1) from aspnet_Applications

PRINT 'Application Count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' = ' + CAST(@CNT_OldTable AS VARCHAR)

-- -------------------------------------------------------- 
-- Roles --------------------------------------------------

INSERT INTO dbo.Roles (ApplicationId, RoleId, RoleName, Description)
SELECT    o.ApplicationId, o.RoleId, o.RoleName, o.Description 
FROM    dbo.aspnet_Roles o
LEFT JOIN dbo.Roles n ON o.RoleId = n.RoleId
WHERE n.RoleId IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Roles 
SELECT @CNT_OldTable = Count(1) from aspnet_Roles

PRINT 'Roles Count : ' + CAST(@CNT_NewTable AS VARCHAR) + ' = ' + CAST(@CNT_OldTable AS VARCHAR)

-- --------------------------------------------------------
-- Users --------------------------------------------------

INSERT INTO dbo.Users (ApplicationId, UserId, UserName, IsAnonymous, LastActivityDate)
SELECT o.ApplicationId, o.UserId, o.UserName, o.IsAnonymous, o.LastActivityDate 
FROM dbo.aspnet_Users o LEFT JOIN dbo.Users n ON o.UserId = n.UserID 
WHERE n.UserID IS NULL

SELECT @CNT_NewTable = Count(1) from dbo.Users 
SELECT @CNT_OldTable = Count(1) from aspnet_Users

PRINT 'Users count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)

-- --------------------------------------------------------
-- Memberships --------------------------------------------

INSERT INTO dbo.Memberships (ApplicationId, UserId, Password, 
PasswordFormat, PasswordSalt, Email, PasswordQuestion, PasswordAnswer, 
IsApproved, IsLockedOut, CreateDate, LastLoginDate, LastPasswordChangedDate, 
LastLockoutDate, FailedPasswordAttemptCount, 
FailedPasswordAttemptWindowStart, FailedPasswordAnswerAttemptCount, 
FailedPasswordAnswerAttemptWindowsStart, Comment) 

SELECT o.ApplicationId, o.UserId, o.Password, 
o.PasswordFormat, o.PasswordSalt, o.Email, o.PasswordQuestion, o.PasswordAnswer, 
o.IsApproved, o.IsLockedOut, o.CreateDate, o.LastLoginDate, o.LastPasswordChangedDate, 
o.LastLockoutDate, o.FailedPasswordAttemptCount, 
o.FailedPasswordAttemptWindowStart, o.FailedPasswordAnswerAttemptCount, 
o.FailedPasswordAnswerAttemptWindowStart, o.Comment 
FROM dbo.aspnet_Membership o
LEFT JOIN Memberships n ON  o.ApplicationId = n.ApplicationId
                      AND o.UserId = n.UserId
WHERE n.UserId IS NULL AND n.ApplicationId IS NULL


SELECT @CNT_NewTable = Count(1) from dbo.Memberships 
SELECT @CNT_OldTable = Count(1) from aspnet_Membership

PRINT 'Memberships count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)

-- -------------------------------------------------------
-- UsersInRoles ------------------------------------------
TRUNCATE TABLE dbo.UsersInRoles
INSERT INTO dbo.UsersInRoles SELECT * FROM dbo.aspnet_UsersInRoles


SELECT @CNT_NewTable = Count(1) from dbo.UsersInRoles 
SELECT @CNT_OldTable = Count(1) from aspnet_UsersInRoles

PRINT 'UsersInRoles count: ' + CAST(@CNT_NewTable AS VARCHAR) + ' >= ' + CAST(@CNT_OldTable AS VARCHAR)


Migration from OS Family 1 to 4

Open the .cscfg file and edit the osFamily attribute in the ServiceConfiguration node.
<ServiceConfiguration serviceName="MyApp" osFamily="4" osVersion="*" ...>


Wrapping up

The only step left is to deploy to the staging environment to see if everything is working as expected. I would also recommend planning to upgrade as soon as possible, because the official retirement date of Azure SDK 2.1 is November 2015. I hope this post can help you, even if you are migrating from and to different versions. Any comments, suggestions, and/or questions are welcome.


~ Frank Boucher


Reading Notes #163

Suggestion of the week


Cloud


Programming

  • .NET Core is Open Source - Get a better understanding of what .NET Core is and the meaning/purpose of the open-source announcement.
  • The Roadmap for WPF - This post gives all the details about what is coming next for Windows Presentation Foundation (WPF).

Miscellaneous



~Frank


Reading Notes #162

Suggestion of the week


Cloud


Programming


Miscellaneous


~Frank Boucher


Reading Notes #161

Cloud


Programming


Miscellaneous


~Frank Boucher



The making of: Franky's Notes Azure Search - part 2

This post concludes The making of: Franky's Notes Azure Search. In the previous post, I built a console application in .NET using the RedDog.Search library to populate an index in my Azure Search service with my notes.
In this post, I’m sharing with you how I created the user interface to query my notes. To know more about the Azure Search REST API, all the documentation is available online.

Objectives


For this part of the project, we will use the azure-search JavaScript client by Richard Astbury, available on GitHub. The idea is to build a nice user interface (UI) that provides a simple and efficient way to search. Since the code will be in JavaScript, it is strongly suggested to use a query key instead of a master key. These keys can be managed from the Azure Portal.

Azure Search Query Keys

Creating the Interface


First, we need to get the azure-search client. You can either download the file azure-search.min.js from GitHub or execute npm install azure-search from a Node.js console.
Now we need a simple HTML page with a form, a textbox and a button.
    <html>
        <head>
            <title>Search</title>
            <link  href="css/bootstrap.min.css" rel="stylesheet">
            <!--[if lt IE 9]>
                <script src="scripts/html5shiv.min.js"></script>
                <script src="scripts/respond.min.js"></script>
            <![endif]-->
        </head>
        <body>
            <form>
                <label>Search</label>
                <input id="txtSearch" placeholder="Search">
                <button id="btnSearch" type="button">Search</button>
            </form>

            <div id="result"></div>

            <script src="scripts/jquery.min.js"></script>
            <script src="scripts/bootstrap.min.js"></script>
            <script src="scripts/azure-search.min.js"></script>
            <script>

                var client = AzureSearch({
                  url: "https://frankysnotes.search.windows.net",
                  key:"DB7B9D1C53EC08932D8A8D5A1406D8CA" // - query only
                });

            </script>
        </body>
    </html> 
As you can see I’m creating the AzureSearch client using my query key from before. Afterwards, we create a Search function to retrieve the search criteria from the textbox and pass it to the client. A dynamic function is used as a callback that receives the parameter noteList which is an array of matching documents. We finally just need to loop through the result to build a nice output.
    function Search(){

        var _searchCriteria = $("#txtSearch").val();   
        var _objSearch = {search: _searchCriteria, $orderby:'title desc'};

        client.search('notes', _objSearch, function(err, noteList){
            var $divResult = $("div#result");
            $divResult.html( "<div class='panel-heading'><h3 class='panel-title'>" + noteList.length + " Result(s)</h3></div><div class='panel-body'>" );

            if(noteList.length > 0){

                var _strResult = "";
                _strResult = "<ul class='list-group'>";

                for(var key in noteList){
                    var fNote = noteList[key];

                    _strResult += "<li class='list-group-item'><a href='" + fNote.url + "' target='_blank'>" + fNote.title + "</a><p>" + fNote.note + "</p></li>";
                }

                _strResult += "</ul></div>";
                $divResult.append( _strResult );
            }
      });
    }
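The last piece is to wire the button to the function; something along these lines, placed in the same script block, should do the job (a sketch using the jQuery already referenced by the page):

    $(function(){
        // Trigger a search every time the button is clicked
        $("#btnSearch").click(Search);
    });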

If we put all this together, we get a nice result.

Franky's Notes Search UI

Live Demo

Conclusion


I really had a lot of fun creating these little applications, and I found the client incredibly easy to use. I hope it helps you get ideas and move forward with Azure Search. Any comments, suggestions, and/or questions are welcome.


~ Frank Boucher


References


Reading Notes #160

Suggestion of the week


Cloud


Programming

  • Don't Frown on CSVs - Nice post that explains why we shouldn't overlook the good old CSV format.

Miscellaneous


~Frank Boucher


The making of: Franky's Notes Azure Search - part 1


For a long time now, I have been thinking about creating an API that would let me search easily through my notes. When Azure Search went public a few weeks ago, I knew it was what this project needed to come alive. In this post, I will share how I did it and, more importantly, show how incredibly easy it was to do.


What's Azure Search?


Currently in preview, Azure Search is a cloud-based search-as-a-service that provides a set of REST APIs defined in terms of HTTP requests and responses, in OData JSON format.

Getting Started


From the Azure Portal, let's create an Azure Search service by clicking the plus button at the bottom left of the screen. Select the Search option and fill in the options.

Azure portal: create a Search service

Application to populate my Azure Search service


First, we will need some data. My weekly Reading Notes posts are generated with a Ruby script that I wrote a few years ago. You can read more about it in First step with Ruby: Kindle Clipping Extractor. Basically, the script extracts my notes from my Kindle and builds a collection of notes grouped into different categories to generate a markdown file. Adding a new JSON output file to the script was easy. Here is a quick view of this output.
{
  "json_class": "FrankyNotes",
  "categories": {
    "dev": [
      {
        "id": 77077357,
        "title": "Customize the MVC 5 Application Users’ using ASP.Net Identity 2.0",
        "author": "Dhananjay kumar",
        "url": "http://debugmode.net/2014/10/01/customize-the-mvc-5-application-users-using-asp-net-identity-2-0/",
        "note": "Need to get the fukk article",
        "tags": "dev,frankysnotes,readingnotes160",
        "date": "2014/10/17",
        "category": "dev"
      },
      {
        "id": 77156372,
        "title": "Custom Login Scopes, Single Sign-On, new ASP.NET Web API – updates to 
      [...]

Now that we have some data, we need to create an index and be able to add documents to it. A console application will be perfect for this job. At the time of writing this post, two libraries exist to interact with the Microsoft Azure Search REST API. For this part of the project, we will use the RedDog.Search library, available on GitHub, since it's a .NET library.

Note: To create an index or upload documents you will need an admin key.

Admin key

First, we need to create an index. Let's keep it simple and just create the index with all the properties of the JSON object. Here is the code of my CreateNoteIndex function.
public IndexManagementClient Client
{
    get
    {
        if (_client == null){
            _client = new IndexManagementClient(ApiConnection.Create("frankysnotes", "AdminKey"));
        }
        return _client;
    }
}

public async Task<string> CreateNoteIndex()
{
    var createResult = await Client.CreateIndexAsync(new Index("notes")
        .WithStringField("id", opt => opt.IsKey().IsRetrievable())
        .WithStringField("title", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("author", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("url", opt => opt.IsRetrievable().IsSearchable(false))
        .WithStringField("note", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("tags", opt => opt.IsRetrievable().IsFilterable().IsSearchable())
        .WithStringField("date", opt => opt.IsRetrievable().IsSearchable())
        .WithStringField("category", opt => opt.IsRetrievable().IsFilterable().IsSearchable())
        );
    if (createResult.IsSuccess)
    {
        return "Index created successfully";
    }

    // Make sure every code path returns a value
    return "Index creation failed";
}

To be able to search by note instead of by post, I decided to break the file down into multiple documents, each containing one note. After that, it was really easy to upload the documents into the index.
public async Task<string> AddNotes(string filepath)
{
    var docs = new List<IndexOperation>();
    FrankysNotes notes = DeserializeFNotes(filepath);

    // categories is assumed to be a dictionary keyed by category name
    foreach (var category in notes.categories.Keys)
    {
        foreach (var fNote in notes.categories[category])
        {
            var doc = ConvertfNote(fNote);
            docs.Add(doc);
        }
    }

    var result = await Client.PopulateAsync("notes", docs.ToArray<IndexOperation>());

    return "File uploaded successfully";
}


private FrankysNotes DeserializeFNotes(string filepath)
{
    var jsonStr = File.ReadAllText(filepath);
    var serializer = new JavaScriptSerializer();

    var notes = serializer.Deserialize<FrankysNotes>(jsonStr);
    return notes;
}

private IndexOperation ConvertfNote(FrankysNote fnote)
{
    var doc = new IndexOperation(IndexOperationType.Upload, "id", fnote.id)
                    .WithProperty("title", fnote.title)
                    .WithProperty("author", fnote.author)
                    .WithProperty("url", fnote.url)
                    .WithProperty("note", fnote.note)
                    .WithProperty("tags", fnote.tags)
                    .WithProperty("date", fnote.date)
                    .WithProperty("category", fnote.category);
    return doc;
}

To keep the code as clear as possible, I removed all validation and error management. The JSON file is deserialized; then, looping through all the notes, I build a list of IndexOperation. Finally, I upload all the notes with Client.PopulateAsync("notes", docs.ToArray<IndexOperation>());
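For completeness, here is a minimal sketch of the POCO classes the deserialization assumes; the shapes are inferred from the JSON sample above, so your own classes may differ slightly:

using System.Collections.Generic;

public class FrankysNotes
{
    public string json_class { get; set; }

    // One entry per category ("dev", "cloud", ...), each holding its notes
    public Dictionary<string, List<FrankysNote>> categories { get; set; }
}

public class FrankysNote
{
    // Kept as a string so it can be used directly as the index key
    public string id { get; set; }
    public string title { get; set; }
    public string author { get; set; }
    public string url { get; set; }
    public string note { get; set; }
    public string tags { get; set; }
    public string date { get; set; }
    public string category { get; set; }
}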

Wrapping up


Using the RedDog.Search library to push documents into an Azure Search index was extremely easy. In fact, it's that simplicity that pushed me to share my discovery. In the next part of the series, I will create a simple HTML page to run real queries.

Stay tuned...

~ Frank Boucher

References

Reading Notes #159

Suggestion of the week

 

Cloud

 

Programming

 

Database

 

Miscellaneous

~Frank Boucher

Reading Notes #158

Suggestion of the week


Cloud

  • Designing for Big Scale in Azure (K. Dotchkoff) - Nice post that explains how we should change our design in the cloud and use logical container or scale units.

Programming

  • Building a Better NuGet (Edward Charbeneau) - Nice post that gives us the best practices when developing a NuGet package.

Miscellaneous


~Frank


Why I switch to Markdown

Markdown is not a new video game, but a way to write in plain text that can easily be converted to HTML. This "new" standard is gaining in popularity for many reasons. In this post, I will explain why I like it and show you some basic syntax, and nice tools.

What is Markdown

The exact definition of Markdown can be found on the Markdown website and looks like this:
Markdown is a text-to-HTML conversion tool for web writers. Markdown allows you to write using an easy-to-read, easy-to-write plain text format, then convert it to structurally valid HTML.
The Markdown syntax is very easy to learn. In fact, it will come mostly by itself, since it looks nice in any text editor. For example, titles, subtitles, and lists look like this:

Title and sub-title
This Is My Title
================

Subtitle 1
----------

Here some items:
- Item 1
- item 2
- item 3

That's easy, right?! Let's add two more samples, this time a bit more "complex": links and images.

Link

For links, two styles are possible:

Inline:

[Link Text](http://www.frankysnotes.com)
Reference:

[Link Text][1]
[Another one][link2]

...

[1]: http://www.frankysnotes.com 
[link2]: http://www.frankysnotes.com    

I personally prefer the reference style because it keeps the text clean and easy to read. To add an image, it's mostly the same two styles again, but we add an exclamation point in front of the square brackets.

Image
![alternative text](http://frankysnotes.com/images/logo.png)

![alternative text][logo]

...

[logo]:http://frankysnotes.com/images/logo.png

These simple things cover mostly everything we need when writing documentation, blog posts, or reference documents. Obviously, if you need more, you can always go to the Markdown website. I also put this full article online in Markdown format.


Why Markdown is so nice

First, I really like Markdown because I can edit my files on all platforms. Since they are regular text files, any text editor on Android, iOS, Windows Phone, PC, or Linux will do the job perfectly. Plus, your text will never lose its formatting when moving from one device to another (as can happen with Word documents).
Likewise, since I work on different devices, I usually put my files in a shareable place like Dropbox or OneDrive. A simple text file is very small and quick to synchronize.

Tools, apps and more

Yes, you can edit your files in any text editor, but here are some nice tools that will improve your experience.

MarkdownPad

MarkdownPad is a full-featured Markdown editor for Windows. It's available in free and pro versions. Some interesting features are:
  • Instant HTML Preview
  • Easy Markdown formatting with keyboard shortcuts
  • Spell check
  • Use your own CSS
  • HTML and PDF Export
Website: http://markdownpad.com/


Denote
Denote is a Markdown text editor for Android that provides effortless syncing with Dropbox. Files created with Denote are saved as text (.txt) files.
  • Live preview for Markdown and HTML
  • Cloud based: Denote stores all its data in a subfolder on your personal Dropbox account so you can access it via your Mobile devices, Mac or PC
  • Offline support: changes are synced with Dropbox next time you're connected
  • Email files created in Denote
  • Customize font size and type face used for notes
Website: http://www.2storks.com/denote


Atom
This great text editor from GitHub has a nice Markdown Preview package that converts the Markdown to HTML.
Atom Website: https://atom.io/
Markdown Preview package: https://github.com/atom/markdown-preview


Sublime Text
Sublime Text is a well-known text editor in the developer community, and many different packages are also available.
Sublime Text Website: http://www.sublimetext.com/
Markdown Preview package: https://github.com/revolunet/sublimetext-markdown-preview

In conclusion

I hope this post will motivate you to give it a try. Thanks for reading. Any comments, suggestions and/or questions are welcome.


~Frank Boucher