Power BI and Microsoft Azure Consumption

Recently, I needed to check and compare Azure consumption for a client. What a repetitive task: download the csv files from the Azure billing portal, open them in Excel to clean/merge/customize them… Wouldn't it be great if this could be done easily? Well, it can! Power BI is the perfect tool for that (and a lot more). In this post, I will explain how I created my Power Query and solved the different problems I encountered along the way.

The Goal


I want Power BI to read (dynamically) all the csv files in a folder and update all my charts and graphs, so I can share them easily with my clients.

The Tools


To create Power Queries, you can use the new Power BI Desktop, available online for free, or Excel. With Excel 2016, the Power Query editor is included; for previous versions, you need to install the Microsoft Power Query for Excel add-in. In both cases, many tutorials explain how to get started with these tools (see the references at the end of this post).

The Problem


Creating our query should be pretty straightforward, since we can create a Power Query by selecting a folder as a source.
The problem is that our file contains three types of records: Provisioning Status, Statement, and Daily Usage. These “tables” are very different and don’t have the same number of columns. This is why, when we try to merge them, we get errors.


The Solution


The way to solve this problem is to create a function that parses one file to extract one recordset, and then to call that function for all the files in the folder.

Note:
The simplest way to get started is to work with one file, then convert the query to a function. The way to do that is to replace the path of the csv file with a variable that will be passed as a parameter: (filePath) =>.
To keep the code as simple as possible, I kept the function and the looping query separate, but they could be merged into a single query.

Extract “Daily Usage”


Here are the queries to extract the Daily Usage (the third recordset) from the csv file, along with a short description of the code.
 // -- fcnCleanOneCSV_v2 ----------------------------------------

(filePath) =>
let
   // Load the file as a one-column table of raw text lines
   fnRawFileContents = (fullpath as text) as table =>
      let
         Value = Table.FromList(Lines.FromBinary(File.Contents(fullpath)), Splitter.SplitByNothing())
      in
         Value,

   Source = fnRawFileContents(filePath),
   // Find the row that marks the start of the Daily Usage recordset
   #"Daily Usage Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Daily Usage")),
   DailyPosition = Table.PositionOf(Source, #"Daily Usage Row"{0}),
   // Keep only the rows after the marker, then promote the first row to headers
   TopRemoved = Table.Skip(Source, DailyPosition + 1),
   Result = Table.PromoteHeaders(TopRemoved)
in 
   Result
The first part loads the content of the file as a one-column table. DailyPosition then stores the position where the Daily Usage data starts. This value is used in Table.Skip(Source, DailyPosition + 1) to keep only the rows after it; since Daily Usage is the last recordset in the file, this works perfectly.
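For readers more at home outside of M, here is the same marker-and-skip logic sketched in Python. This is only an illustration of the technique, not part of the queries; the function name and sample data are hypothetical.

```python
import csv
import io

def extract_daily_usage(text):
    """Keep only the rows after the 'Daily Usage' marker and promote
    the first kept row to headers (illustrative sketch)."""
    lines = text.splitlines()
    # Position of the row that marks the start of the Daily Usage recordset
    marker = next(i for i, line in enumerate(lines) if "Daily Usage" in line)
    # Equivalent of Table.Skip(Source, DailyPosition + 1)
    rows = list(csv.reader(io.StringIO("\n".join(lines[marker + 1:]))))
    # Equivalent of Table.PromoteHeaders
    headers, data = rows[0], rows[1:]
    return [dict(zip(headers, row)) for row in data]

sample = "Provisioning Status\n...\nDaily Usage\n" \
         "Usage Date,Meter Category,Consumed Quantity\n2015-11-01,Storage,42"
print(extract_daily_usage(sample))
# → [{'Usage Date': '2015-11-01', 'Meter Category': 'Storage', 'Consumed Quantity': '42'}]
```

Because Daily Usage is the last section, nothing after the marker needs to be trimmed, which is exactly why the M version gets away with a single Table.Skip.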
 //== Process Folder CSV_v2 for Daily Usage==============================

let
   Source = Folder.Files("C:\Azure_Consumption_demo\CSV_v2\"),
   MergedColumns = Table.CombineColumns(Source,{"Folder Path", "Name"},Combiner.CombineTextByDelimiter("", QuoteStyle.None),"Merged"),
   RemovedOtherColumns = Table.SelectColumns(MergedColumns,{"Merged"}),
   #"Results" = Table.AddColumn(RemovedOtherColumns , "GetCsvs", each fcnCleanOneCSV_v2([Merged])),
   #"Removed Columns" = Table.RemoveColumns(Results,{"Merged"}),
   #"Expanded GetCsvs" = Table.ExpandTableColumn(#"Removed Columns", "GetCsvs", {"Usage Date,Meter Category,Meter Id,Meter Sub-category,Meter Name,Meter Region,Unit,Consumed Quantity,Resource Location,Consumed Service,Resource Group,Instance Id,Tags,Additional Info,Service Info 1,Service Info 2"}, {"Usage Date,Meter Category,Meter Id,Meter Sub-category,Meter Name,Meter Region,Unit,Consumed Quantity,Resource Location,Consumed Service,Resource Group,Instance Id,Tags,Additional Info,Service Info 1,Service Info 2"}),


   #"Demoted Headers" = Table.DemoteHeaders(#"Expanded GetCsvs"),
   #"Split Column by Delimiter" = Table.SplitColumn(#"Demoted Headers","Column1",Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv),{"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8", "Column1.9", "Column1.10", "Column1.11", "Column1.12", "Column1.13", "Column1.14", "Column1.15", "Column1.16"}),
   #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Column1.1", type text}, {"Column1.2", type text}, {"Column1.3", type text}, {"Column1.4", type text}, {"Column1.5", type text}, {"Column1.6", type text}, {"Column1.7", type text}, {"Column1.8", type text}, {"Column1.9", type text}, {"Column1.10", type text}, {"Column1.11", type text}, {"Column1.12", type text}, {"Column1.13", type text}, {"Column1.14", type text}, {"Column1.15", type text}, {"Column1.16", type text}}),
   #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type"),
   #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"Usage Date", type date}, {"Meter Region", type text}}),
   #"Replaced Value" = Table.ReplaceValue(#"Changed Type1","""","",Replacer.ReplaceText,{"Meter Category", "Meter Id", "Meter Sub-category", "Meter Name", "Meter Region", "Unit", "Resource Location", "Consumed Service", "Instance Id", "Tags", "Additional Info", "Service Info 1", "Service Info 2"}),
   #"Changed Type2" = Table.TransformColumnTypes(#"Replaced Value",{{"Consumed Quantity", type number}})
in
  #"Changed Type2"
In the first steps, we get all the files in the folder, then combine columns to build the full path of each file. We then pass each path to the function defined previously. With Table.SplitColumn, we rebuild the result as a table with multiple columns.
The rest of the query cleans up the result by changing the column types and removing undesired characters.
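The folder-looping pattern itself can be sketched in Python as well: list every csv in the folder, run a per-file parsing function, and concatenate the results into one table. The function and parameter names here are hypothetical.

```python
from pathlib import Path

def process_folder(folder, parse_one):
    """Apply a per-file parsing function to every csv file in a folder
    and concatenate the results (sketch of the 'Process Folder' query)."""
    all_rows = []
    for csv_file in sorted(Path(folder).glob("*.csv")):
        # Equivalent of merging "Folder Path" and "Name" into a full path,
        # then calling the cleaning function on each file
        all_rows.extend(parse_one(csv_file.read_text()))
    return all_rows
```

Keeping the parser separate from the loop, as the post does with its M queries, means the same loop can drive either the Daily Usage or the Statement extraction just by swapping the function passed in.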


Extract “Statement”


To get the Statement recordset, it’s the same approach, except that we use Table.Range, since the rows we are looking for sit between the Provisioning Status and Daily Usage sections.
//== fcnGetStatement ========================================== 

(filePath) =>
let
   // Load the file as a one-column table of raw text lines
   fnRawFileContents = (fullpath as text) as table =>
      let
         Value = Table.FromList(Lines.FromBinary(File.Contents(fullpath)), Splitter.SplitByNothing())
      in
         Value,

    Source = fnRawFileContents(filePath),
    // Find the two markers that delimit the Statement recordset
    #"Daily Usage Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Daily Usage")),
    DailyPosition = Table.PositionOf(Source, #"Daily Usage Row"{0}),
    #"Statement Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Statement")),
    StatementPosition = Table.PositionOf(Source, #"Statement Row"{0}),
    // Keep only the rows between the two markers, then promote headers
    SelectedRows = Table.Range(Source, StatementPosition + 1, (DailyPosition - StatementPosition) - 2),
    Result = Table.PromoteHeaders(SelectedRows)
in
    Result
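As a sanity check on the arithmetic inside Table.Range, here is the equivalent slicing in Python: the count (DailyPosition - StatementPosition) - 2 keeps the header and data rows while skipping the blank separator line just before the Daily Usage marker. The helper name and sample data are illustrative only.

```python
def extract_statement(text):
    """Keep only the rows between the 'Statement' and 'Daily Usage'
    markers, mirroring Table.Range(Source, start+1, (end-start)-2)."""
    lines = text.splitlines()
    start = next(i for i, l in enumerate(lines) if "Statement" in l)
    end = next(i for i, l in enumerate(lines) if "Daily Usage" in l)
    # start+1 .. start+1+(end-start)-2 covers everything strictly between
    # the markers except the blank line right before "Daily Usage"
    return lines[start + 1 : start + 1 + (end - start) - 2]

sample = ("Provisioning Status\nStatement\n"
          "Billing Period,Value\nNov 2015,10\n\nDaily Usage")
print(extract_statement(sample))
# → ['Billing Period,Value', 'Nov 2015,10']
```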
And once again we loop through every file and do some clean-up.
//== Query Statements ========================================

let
    Source = Folder.Files("C:\Azure_Consumption_demo\CSV_v2\"),
    MergedColumns = Table.CombineColumns(Source,{"Folder Path", "Name"},Combiner.CombineTextByDelimiter("", QuoteStyle.None),"Merged"),
    RemovedOtherColumns = Table.SelectColumns(MergedColumns,{"Merged"}),
    #"Results" = Table.AddColumn(RemovedOtherColumns , "GetCsvs", each fcnGetStatement([Merged])),
    #"Removed Columns" = Table.RemoveColumns(Results,{"Merged"}),
    #"Expanded GetCsvs" = Table.ExpandTableColumn(#"Removed Columns", "GetCsvs", {"Billing Period,Meter Category,Meter Sub-category,Meter Name,Meter Region,SKU,Unit,Consumed Quantity,Included Quantity,Within Commitment,Overage Quantity,Currency,Overage,Commitment Rate,Rate,Value"}, {"Billing Period,Meter Category,Meter Sub-category,Meter Name,Meter Region,SKU,Unit,Consumed Quantity,Included Quantity,Within Commitment,Overage Quantity,Currency,Overage,Commitment Rate,Rate,Value"}),


    #"Demoted Headers" = Table.DemoteHeaders(#"Expanded GetCsvs"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Demoted Headers","Column1",Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv),{"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8", "Column1.9", "Column1.10", "Column1.11", "Column1.12", "Column1.13", "Column1.14", "Column1.15", "Column1.16"}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Split Column by Delimiter"),
    #"Replaced Value" = Table.ReplaceValue(#"Promoted Headers","""","",Replacer.ReplaceText,{"Meter Category", "Meter Sub-category", "Meter Name", "Meter Region", "SKU", "Unit"})
in
    #"Replaced Value"

Once all that is done, the fun can begin!




References




Reading Notes #210

Suggestion of the week


Cloud


Databases


    Reading Notes #209


    Cloud


    Databases


    Programming




    Reading Notes #208


    Suggestion of the week


    Cloud


    Programming


    Miscellaneous




    Reading Notes #207

    Suggestion of the week

    • A Beginner’s Mind - A very inspiring article, especially for younger readers, but also for more experienced ones who want to keep their inner flame alive.

    Cloud


    Databases


    Programming


    Miscellaneous



    Reading Notes #206

    Cloud


    Programming

    • Learn You Node with VS Code (G. Andrew) - This post is really an excellent starting point to learn Node.js. It gives good references, tools, and tips.

    Miscellaneous

    • MVP Award Update - Oct 2015 - This post explains the changes done to the MVP program to improve it. A must to all current and future MVP candidates.


    What you shouldn't have missed in the last few weeks


    In September, Microsoft made so many announcements that it’s difficult to keep track of them all. This short post simply lists the most important ones and gives you the opportunity to watch them again, or for the first time.

    AzureCon 2015

    AzureCon was a virtual event on September 29 that focused on Microsoft Azure. Many View Parties around the globe watched this event, rich in announcements. You can watch the AzureCon keynotes online.
    But AzureCon was not only keynotes; there were more than 50 technical sessions covering every Azure feature. Get the full list on Channel9 here.

    Windows 10 Devices

    At the beginning of October, the Windows 10 Devices event was really amazing. Microsoft showed us all its new devices, and there were a lot of them! Don’t take my word for it; go see for yourself in this blog post by Terry Myerson.

     

    What’s new

    Get started and deploy your first cloud solution in under 5 minutes. Find tons of short videos online that teach you how to quickly enjoy the power of Azure.



    ~Frank


    Reading Notes #205

    Cloud


    Programming


    Miscellaneous


    ~Frank



    Reading Notes #204

    Suggestion of the week


    Cloud


    Programming


    Databases


    Miscellaneous

    • Going Back to One (Alexandre Brisebois) - Organizing our work to become top performers can easily be done in Windows 10.
    • Static Site or CMS? (Brian Rinaldi) - Nice post that gives insights to answer one of the most frequent questions people face when starting a blog or website.


    Reading Notes #203



     

     

    Suggestion of the week


    Cloud


    Programming


    Databases


    Miscellaneous

    • Going Back to One (Alexandre Brisebois) - Organizing our work to become top performers can easily be done in Windows 10.

    ~Frank 



    What is an AzureCon View Party?


    First what is AzureCon?


    In less than a week, Microsoft is holding a great event called AzureCon. It is a virtual conference focused on Microsoft Azure; because it happens online, it will be available to watch live, for free! The lineup has been published, and four great speakers will share with us the latest news about Azure.


    What is a View Party?

    A View Party is a chance to watch the same live content as everyone else, but in a group. It’s an opportunity to ask your questions while it’s happening and get answers from MVPs or other viewers.

    Where are those View Parties?

    By the time I'm writing this post, I don't know all of them, but sharing is good, so if you know of a View Party happening in your area, share the info in the comment section. You could also send me an e-mail, and I will update this post. I will be in Ottawa, looking forward to meeting you there!
    • Montreal
      MsDevMtl Community
      2000 McGill College, 5e étage, Montréal, QC, Montréal, QC
      Meetup
    • Ottawa
      Ottawa IT Community
      100 Queen Street, Suite 500 , Ottawa, ON
      Meetup


    Reading Notes #202

     

    Suggestion of the week


    Cloud


    Programming


    Miscellaneous



    Reading Notes #201

    Cloud

    • Tracing and logging with Application Insights (Andrei Dzimchuk) - You know the 101 about App Insights and are looking for something more specific? This post is a must; it shows how to transform an ordinary logger into a great source of information.

    Programming


    Databases


    Miscellaneous



    Reading Notes #200

    Suggestion of the week


    Cloud


    Databases


    Books

    • Microsoft Azure Essentials: Fundamentals of Azure (Michael S. Collier & Robin E. Shahan) - This week I felt like returning to the sources and reading the essentials. It is difficult to summarize a summary, but all the principal families of features are approached, the “must have” tools are covered… even some scenarios. A really good book; even better, it’s for everyone.

    Reading Notes #199

    Suggestion of the week

    • First look at Application Insights (Andrei Dzimchuk) - Excellent post. Perfect for making up your mind about why and how we should use Application Insights. Nice touch on the config.

    Cloud


    Programming

    Databases

    Miscellaneous



    Reading Notes #198

    Cloud


    Podcast

    • .NET Rocks! - Great episode; a very interesting discussion about the new Azure Service Fabric.

    Programming


    Databases



    Get more from Azure Application Insights

    Application Insights is an incredible tool that brings you a gigantic quantity of information about your application. In a previous post, Tons of statistics and metrics for Microsoft Azure websites (not only in Asp.Net), I explained how to get started with App Insights. In this post, I want to explain how you can create your own custom events or metrics.

    The setup


    I will use an Asp.Net MVC application named SimpleWebApp that consumes an Azure API App named FrankApiApp. For demo purposes, I will have both projects in the same solution, but let's pretend that the API App is not ours and that we don't control what is inside.

    To create an Azure API App, you will need Visual Studio 2015, or Visual Studio 2013 with at least Azure SDK 2.6 installed. When creating the app, select Asp.Net Web Application, then Azure API App (Preview).


    For this post, we simply need one function. Here is the code of the Get method, which waits a random amount of time and then returns two strings:
        // GET api/values
        public IEnumerable<string> Get()
        {
            var rnd = new Random();
            var waitFor = rnd.Next(500, 3000);
    
            System.Threading.Thread.Sleep(waitFor);
            var msg = String.Format("You have wait {0}. Sorry about that.", waitFor);
    
            return new string[] { waitFor.ToString(), msg };
        }

    The application where telemetry will be added is a simple Asp.Net Web Application with MVC, created with the default template in Visual Studio. To add App Insights, just right-click on the project and select Add Application Insights Telemetry.


    Now we need to add a reference to the Azure API App. That can easily be done: right-click, once again, on the project and select Add, then Azure API App Client. A new window will appear, and we only need to select our freshly deployed API App.


    The Base


    Of course, just by adding the code snippet to the head of _Layout.cshtml, you will get a lot of information about your server and your application, but also about the client experience and browser. This is all very useful, and it is worth the time to navigate and dig into the Azure portal to understand how your application is doing.


    Custom Metric


    An interesting piece of information would be the time taken by each call to FrankApiApp. Let's start by adding a custom metric. To do that, we will add a function to the HomeController and a new tab to the menu in _Layout.cshtml.
    public class HomeController : Controller
    {
        private TelemetryClient appInsights = new TelemetryClient();
    
        public ActionResult ApiCall()
        {
            var stopwatch = System.Diagnostics.Stopwatch.StartNew();
            var client = new FrankApiApp();
            var result = client.Values.Get();
    
            stopwatch.Stop();
    
            ViewBag.Message = result[1];
    
            appInsights.TrackMetric("TimeDoingNothing", stopwatch.Elapsed.TotalSeconds);
    
            return View("Contact");
        }
    }   

    First, we create an instance of TelemetryClient. Then, before calling FrankApiApp's Get method, we start a stopwatch. Finally, we just need to use the TrackMetric method to push the metric.

    After deploying and navigating a little in the application, let's see what is available to us. Open portal.azure.com and navigate to your Application Insights resource. Click on Metrics Explorer; a new blade opens with three empty charts. Click on the timeline, then in the Metric textbox type "TimeDoingNothing" (our new metric's name). Now click on the grid (the third chart), add the metric name, and select Group By Continent. Interesting, right?!



    Even more


    Application Insights gives you the opportunity to track many things: PageViews, Events, Exceptions, Requests, Traces and, of course, Metrics. Here are some alternatives to our ApiCall function using TrackEvent and TrackRequest.
    // Using TrackEvent
    public ActionResult ApiCall()
    {
        var stopwatch = System.Diagnostics.Stopwatch.StartNew();
    
        var client = new FrankApiApp();
        var result = client.Values.Get();
        stopwatch.Stop();
    
        ViewBag.Message = result[1];
    
        var metrics = new Dictionary<string, double> {{"processingTime", stopwatch.Elapsed.TotalMilliseconds}};
        var properties = new Dictionary<string, string>  {{ "MessageSuggestedElapsed", result[0] } };
    
        // Send the event:
        appInsights.TrackEvent("APICallProcessed", properties, metrics);
        appInsights.TrackMetric("TimeDoingNothing", stopwatch.Elapsed.TotalSeconds);
    
        return View("Contact");
    }
    
    
    
    // Using TrackRequest
    public ActionResult ApiCallWithTelemetry()
    {
        appInsights.Context.Operation.Id = Guid.NewGuid().ToString();
    
        var stopwatch = System.Diagnostics.Stopwatch.StartNew();
        var client = new FrankApiApp();
        var result = client.Values.Get();
    
        stopwatch.Stop();
    
        ViewBag.Message = result[1];
    
        // Send the event:
        appInsights.TrackRequest("APICallRequest", DateTime.UtcNow, stopwatch.Elapsed, "200", true);
        appInsights.TrackMetric("TimeDoingNothing", stopwatch.Elapsed.TotalSeconds);
    
        return View("Contact");
    }



    Wrap it up


    I couldn't show everything in one post; it's up to you to create new metrics and events and query them in Azure Application Insights. Share your discoveries or questions in the comment section below.

    References



    Reading Notes #197

    Suggestion of the week

    Cloud

    Programming

    Podcast

    Miscellaneous


    Azure SDK 2.7 my savior

    Recently, I ran into a very unpleasant problem: I needed to remove all my Microsoft Azure accounts from Visual Studio. An error popped up every time Visual Studio tried to connect to Azure, preventing me from doing anything else. It could happen to anyone, so I decided to share the story so others could save time and energy trying to fix it.

    The Context


    A long time ago, I created my Azure account using my Hotmail account. I changed the administrator to my Outlook.com account when it came out (approximately two years ago). When I installed Azure SDK 2.6, the problems started.

    The Error


    In Visual Studio 2013 Update 4 or VS 2015 RTM, every time VS tried to connect to Azure, I received this error message.

    Every time Visual Studio 2013 Update 4 or VS 2015 try to connect to Azure...


    The Fix


    If you only need Visual Studio, remove the "damaged" Azure account. If you need Visual Studio to work with Azure, you have two options:

    1. Azure SDK 2.5 will work, if that is an acceptable workaround. You need to make sure Azure SDK 2.6 was never installed on the machine; otherwise, it will still have the offending DLL.
    2. Install Azure SDK 2.7 and/or Visual Studio 2015. In my case, it fixed everything!

    ~Frank


    Reading Notes #196

    Suggestion of the week


    Cloud


    Programming


    Miscellaneous


    ~Frank