
Secure an ASP.NET MVC multi-tenant Power BI Embedded report hosted in an Azure WebApp

Note: This post was originally published on Microsoft MVP Award blog, as part of the Technical Tuesday series.

Power BI gives us the possibility to create amazing reports. Even if it's great to be able to share those reports from the very secure Power BI portal, sometimes we need to share them inside other applications or websites. Once again, Power BI doesn't disappoint us, by providing Power BI Embedded. In this post, I will explain how to use Power BI Embedded and make it secure so each tenant can only see his own data.

The Problem

Although many online tutorials explain how to use filters to change what is visible in our reports, filters can easily be changed by the user. Even if you hide the filter pane, those settings could easily be modified using JavaScript... Therefore, it's definitely not the best way to secure private information.
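To see why, here is a minimal sketch of what any visitor could type in the browser console against a report embedded with the powerbi-client JavaScript library used later in this post. The table and column names are the ones from our demo; treat the exact filter shape as an assumption about your client-side library version.

//== Tampering with filters from the console ===========
// Grab the embedded report instance back from the page.
var reportElement = document.getElementById('pbi-report');
var report = powerbi.get(reportElement);

// Replace the report filters to peek at another tenant's data;
// hiding the filter pane does not block this call.
report.setFilters([{
    $schema: "http://powerbi.com/product/schema#basic",
    target: { table: "Customer", column: "CompanyName" },
    operator: "In",
    values: ["Bike World"]
}]);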

The Solution

In this post, I will be using roles to limit access to the data. The well-known Adventure Works database will be used to demonstrate how to partition the data. In this case, we will be using the Customer table.

In Azure

Open the Azure portal to create a Power BI Embedded component. Of course, in a real project it would be better to create it in an Azure Resource Manager (ARM) template, but to keep this post simple we will create it with the portal. Click on the big green "+" at the top left corner. In the search box type powerbi, and hit Enter. Select Power BI Embedded in the list and click the Create button. Once it's created, go to the Access Keys property of the brand-new Power BI Workspace Collection and take note of the key. We will need that key later to upload our Power BI report.
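For reference, a minimal sketch of what such an ARM template could contain (the workspace collection name matches the one used later in this post; the apiVersion and sku values are assumptions based on what the service offered at the time of writing):

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.PowerBI/workspaceCollections",
      "apiVersion": "2016-01-29",
      "name": "FrankWrkSpcCollection",
      "location": "[resourceGroup().location]",
      "sku": {
        "name": "S1",
        "tier": "Standard"
      }
    }
  ]
}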

CreateWorkSpaceCollection

For this demo, the data source will be Adventure Works in an Azure SQL Database. To create it, simply click again the "+" button and select Database. Be sure to select Adventure Works as the source if you want to reproduce this demo.

createDB


In Power BI Desktop

Power BI Desktop is a free tool from Microsoft that will help us to create our report; it can be downloaded here.
Before we get started, two options need to be modified. Go in the File menu and select Options and Settings, then Options. The first one is in the section (tab) Preview Features; check the option: Enable cross filtering in both directions for DirectQuery. The second is in the section DirectQuery; check the option Allow unrestricted measures in DirectQuery mode. It's a good idea to restart Power BI Desktop before continuing.

powerbioptions

To create our report, we first need to connect to our data source, in this case our Azure SQL Database. Click the Get Data button, then Azure, and after that Microsoft Azure SQL Database. It's important to pay attention to the type of connection, Import or DirectQuery, because you won't be able to change it afterward; you would need to rebuild your report from scratch. For this case, select DirectQuery.
This report will be displaying information about invoice details. Be sure to include the table that will be used for your role. In this case, I will be using Customer. Each customer must see only their own invoices.

 tables

The report will contain two charts: the left one is a bar chart where you see the invoice history, and the right one is a pie chart that shows how the products in the invoice(s) are distributed by category.
Note: in the sample database, all customers have only one invoice and they are all on the same date.

chart_noRole

Now we need to create our dynamic role. In the Modeling tab, click on Manage Roles and create a CustomerRole that maps the CompanyName column of the Customer table to the variable USERNAME().
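Concretely, the table filter DAX expression of the role on the Customer table is a one-liner:

[CompanyName] = USERNAME()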

genericRole

Of course, to test if our charts are really dynamic, create other roles and give them specific values, e.g. "Bike World" or "Action Bicycle Specialists". To visualize your report as one of those users, simply click on View as Roles, in the Modeling tab, and select the role you want.

ViewAs

See how the charts look when seen as "Action Bicycle Specialists".

chart_withRole

The report is now ready. Save it; we will need it soon.


Powerbi-cli

To upload our report into our Azure Workspace Collection, I like to use PowerBI-CLI because it runs everywhere, thanks to Node.js.
Open a command prompt or Terminal and execute the following command to install PowerBI-CLI:
npm install powerbi-cli -g
Now if you type 'powerbi', you should see the powerbi-cli help displayed.

powerbicli

It's time to use the access key we noted earlier; use it in this command to create a workspace in our workspace collection.

//== Create Workspace ===========
powerbi create-workspace -c FrankWrkSpcCollection -k my_azure_workspace_collection_access_key

Now, let's upload our Power BI report into Azure. Retrieve the workspace ID returned by the previous command and pass it as the parameter -w (workspace).

//== Import ===========
powerbi import -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -f "C:\powerbidemo\CustomerInvoices.pbix" -n CustomerInvoices -o

Now we will need to update the connection string of our dataset. Get its ID with the following command:

//== Get-Datasets ===========
powerbi get-datasets -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key 

Now update the connection string, passing the dataset ID with the parameter -d:

//== update-connection ===========
powerbi update-connection -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -d 01fcabb6-1603-4653-a938-c83b7c45a59c -u username@servername -p password


In Visual Studio

All the Power BI Embedded part is now complete. Let's create the new ASP.NET MVC Web Application. A few NuGet packages are required; be sure to have these versions or newer:
  • Microsoft.PowerBI.AspNet.Mvc version="1.1.7"
  • Microsoft.PowerBI.Core version="1.1.6"
  • Microsoft.PowerBI.JavaScript version="2.2.6"
  • Newtonsoft.Json version="9.0.1"
By default Newtonsoft.Json is already there but needs an upgrade.
Update-Package Newtonsoft.Json
And for the Microsoft.PowerBI packages, a single install command should take care of all the other dependencies.

Install-Package Microsoft.PowerBI.AspNet.Mvc

We also need to add all the access information we previously used with powerbi-cli into our application. Let's add them in the web.config.

...
<appSettings>
    <add key="powerbi:AccessKey" value="my_azure_workspace_collection_access_key" />
    <add key="powerbi:ApiUrl" value="https://api.powerbi.com" />
    <add key="powerbi:WorkspaceCollection" value="FrankWrkSpcCollection" />
    <add key="powerbi:WorkspaceId" value="01fcabb6-1603-4653-a938-c83b7c45a59c" />
</appSettings>
...

Here is the code of the InvoicesController:

using System;
using System.Configuration;
using System.Linq;
using System.Web.Mvc;
using demopowerbiembeded.Models;
using Microsoft.PowerBI.Api.V1;
using Microsoft.PowerBI.Security;
using Microsoft.Rest;
namespace demopowerbiembeded.Controllers
{
    public class InvoicesController : Controller
    {
        private readonly string workspaceCollection;
        private readonly string workspaceId;
        private readonly string accessKey;
        private readonly string apiUrl;
        public InvoicesController()
        {
            this.workspaceCollection = ConfigurationManager.AppSettings["powerbi:WorkspaceCollection"];
            this.workspaceId = ConfigurationManager.AppSettings["powerbi:WorkspaceId"];
            this.accessKey = ConfigurationManager.AppSettings["powerbi:AccessKey"];
            this.apiUrl = ConfigurationManager.AppSettings["powerbi:ApiUrl"];
        }
        private IPowerBIClient CreatePowerBIClient
        {
            get
            {
                var credentials = new TokenCredentials(accessKey, "AppKey");
                var client = new PowerBIClient(credentials)
                {
                    BaseUri = new Uri(apiUrl)
                };
                return client;
            }
        }
        public ReportViewModel GetFilteredReport(string clientName)
        {
            using (var client = this.CreatePowerBIClient)
            {
                var reportsResponse = client.Reports.GetReportsAsync(this.workspaceCollection, this.workspaceId);
                var report = reportsResponse.Result.Value.FirstOrDefault(r => r.Name == "CustomerInvoices");
                var embedToken = PowerBIToken.CreateReportEmbedToken(this.workspaceCollection, this.workspaceId, report.Id, clientName, new string[] { "CustomerRole" });
                var model = new ReportViewModel
                {
                    Report = report,
                    AccessToken = embedToken.Generate(this.accessKey)
                };
                return model;
            }
        }
        public ActionResult Index()
        {
            var report = GetFilteredReport("Action Bicycle Specialists");
            return View(report);
        }
    }
}

The interesting part of this controller is the method GetFilteredReport. First, it gets all the reports from our workspace, then looks for the one named "CustomerInvoices". The next step is where the loop gets closed: it creates the embed token. Of course, we pass the workspace collection, workspace and report references, and that could be it; but passing only those references would produce a report where all customers are displayed, and that's obviously not what we want here. The two last parameters are a username and an array of roles. When we created roles in Power BI Desktop, we created one called CustomerRole that compares CompanyName to the variable USERNAME(). So here we pass the client name as the username and specify that we want to use the role "CustomerRole".
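The ReportViewModel itself is not shown above; a minimal sketch, assuming it only carries the two properties the controller fills and the view reads (and assuming the Report type comes from the SDK's models namespace), could look like this:

using Microsoft.PowerBI.Api.V1.Models;

namespace demopowerbiembeded.Models
{
    public class ReportViewModel
    {
        // Report metadata returned by the Power BI API (Id, Name, EmbedUrl, ...).
        public Report Report { get; set; }

        // App token generated from the workspace access key,
        // scoped to one username and one role.
        public string AccessToken { get; set; }
    }
}

In a real multi-tenant application, you would of course replace the hard-coded "Action Bicycle Specialists" with the company name resolved from the authenticated user (for example, something derived from User.Identity.Name) before calling GetFilteredReport.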
The last piece of the puzzle is the View, so let's add one.

@model demopowerbiembeded.Models.ReportViewModel
<style>iframe {border: 0;border-width: 0px;}</style>
<div id="test1" style="border-style: hidden;">
    @Html.PowerBIReportFor(m => m.Report, new { id = "pbi-report", style = "height:85vh", powerbi_access_token = Model.AccessToken })
</div>
@section scripts
{
    <script src="~/Scripts/powerbi.js"></script>
    <script>
        $(function () {
            var reportConfig = {
                settings: {
                    filterPaneEnabled: false,
                    navContentPaneEnabled: false
                }
            };
            var reportElement = document.getElementById('pbi-report');
            var report = powerbi.embed(reportElement, reportConfig);
        });
    </script>
}

One great advantage of using ASP.NET MVC is that we have @Html.PowerBIReportFor at our disposal. We can then instantiate the report with the call powerbi.embed(reportElement, reportConfig), where I pass some configuration to remove the navigation and filter panes, but that's optional.

Now if we run our project, you should have a result looking like this.

finalresult


Wrap it up

Voila! This, of course, was a demo and should be optimized. Please leave a comment if you have any questions, or don't hesitate to contact me. It's always great to chat with you.



Reading Notes #253

Suggestion of the week


Cloud


Programming


Databases



Reading Notes #248

Programming


Databases


Miscellaneous


Reading Notes #245

Cloud


Miscellaneous


Reading Notes #242

Cloud


Programming

  • Exploring dotnet new with .NET Core (Scott Hanselman) - I discovered the different types in the dotnet new command during Julie Lerman's talk at DevTeach, and now this post shows a list of incredible opportunities.

Miscellaneous


Reading Notes #238

Cloud


Programming


Miscellaneous

  • Back to my core - In this post, Darrel shares the beginning of a new adventure... Congratulations to you; looking forward to reading from you again. Microsoft gained a really good developer.


Reading Notes #237

Cloud


Programming


Miscellaneous



Reading Notes #231

Suggestion of the week

  • Introducing docs.microsoft.com - Great post that explains all the nice features of this first glimpse of the new Microsoft documentation. They did fantastic work; a post to read, a site to remember.

Cloud


Programming


Data


Miscellaneous




Reading Notes #229

Cloud


Data


Programming

  • Become a Visual Studio 2015 Power User (Allison Buchholtz-Au, Andrew Hall) - All Visual Studio developers must watch this video; big chance you had it in your hand and didn't know about it.

Miscellaneous

  • It's a new blog! (Troy Hunt) - A delicious post; thanks Troy for sharing that with us.


Reading Notes #228

Cloud

  • But Why Do You Trust Your Data? (Buck Woody) - I consider this post a really great teaser. After reading his post, chances are you will check the video it recommends... and then give Azure Data Catalog a shot.

Programming


Data

  • Analyzing your Azure Search traffic (Berni Torres Garayar) - Great post that explains how to improve our services by listening to our clients' search requests, leveraging the new Azure Search Analytics and Power BI.

Miscellaneous


~Frank


Reading Notes #221

Suggestion of the week

  • Why You Should Learn JavaScript in 2016 (Ken Powers) - I heard a lot of people complaining about JavaScript; this excellent post explains why, undeniably, we should all know it, and if that's not the case, why 2016 is a great time to learn it.

Cloud


Programming


Data

  • Power BI Service February Update (Amanda Cofsky) - Fantastic! This update will give us the possibility to share outside our organization... and many other things.

Miscellaneous


~Frank


Reading Notes #215

Cloud


Programming


Data


Miscellaneous



Reading Notes #214

Suggestion of the week

  • Express - This is the perfect post to get started with Node.js on Azure. It walks you step by step from a vanilla computer running OS X or Linux to your first app.

Cloud


Programming


Data


Miscellaneous



PowerBI and Microsoft Azure Consumption

Recently, I needed to check and compare Azure consumption for a client. What a repetitive task: download the csv files from the Azure billing portal, open them in Excel to clean/merge/customize them… Wouldn't it be great if this could be done easily? Well, it can! Power BI is the perfect tool to do that (and a lot more). In this post, I will explain how I created my Power Query and solved the different problems I encountered along the way.

The Goal


I want Power BI to read (dynamically) all the csv files in a folder and update all my charts and graphs, so I can share them easily with my clients.

The Tools


To create Power Queries, you can use the new Power BI Desktop, available online for free, or Excel. With Excel 2016, the Power Query editor tools are included; for previous versions, you need to install the Microsoft Power Query for Excel add-in. In both cases, many tutorials explain how to get started with these tools (see the references at the end of this post).

The Problem


Creating our query should be pretty straightforward, since we can create a Power Query by selecting a folder as a source.
Import_auto_csv
The problem is that our files contain three types of records: Provisioning Status, Statement, and Daily Usage. These “tables” are very different and don't have the same number of columns. This is why, when we try to merge them, we get errors.

Expend_all_fail
Error_Auto_import

The Solution


The way to solve this problem is to create a function that parses one file to extract one recordset, and to call that function for all the files in the folder.

Note:
The simplest way to get started is to work with one file, then convert it to a function. The way to do that is to replace the path of the csv file with a variable that will be passed as a parameter: (filePath) =>. A minimal sketch of this conversion is shown right after this note.
To keep the code as simple as possible, I kept the function and the looping query separated, but they could be merged into only one query.
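For example, starting from a throw-away query that reads a single hard-coded file (the file name below is hypothetical), the conversion to a function looks like this:

//== Before: one hard-coded file ===========
let
    Source = Table.FromList(Lines.FromBinary(File.Contents("C:\Azure_Consumption_demo\CSV_v2\detail.csv")), Splitter.SplitByNothing())
in
    Source

//== After: the same query as a function ===========
(filePath) =>
let
    Source = Table.FromList(Lines.FromBinary(File.Contents(filePath)), Splitter.SplitByNothing())
in
    Source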

Extract “Daily Usage”


Here are the queries to extract the Daily Usage (the third recordset) from the csv file, with some description of the code.
// -- fcnCleanOneCSV_v2 ----------------------------------------

(filePath) =>
let
   fnRawFileContents = (fullpath as text) as table =>
      let
         Value = Table.FromList(Lines.FromBinary(File.Contents(fullpath)), Splitter.SplitByNothing())
      in
         Value,

   Source = fnRawFileContents(filePath),
   #"Daily Usage Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Daily Usage")),
   #"DailyPosition" = Table.PositionOf(Source, #"Daily Usage Row"{0}),
   #"TopRemoved" = Table.Skip(Source, (DailyPosition + 1)),
   #"Result" = Table.PromoteHeaders(TopRemoved)
in
   Result

The first part loads the content of the file as a one-column table. Then DailyPosition stores the position where the Daily Usage data starts. This value is used in Table.Skip(Source, (DailyPosition + 1)) to keep only the rows after it; since Daily Usage is the last recordset, this works perfectly.
 //== Process Folder CSV_v2 for Daily Usage==============================

let
   Source = Folder.Files("C:\Azure_Consumption_demo\CSV_v2\"),
   MergedColumns = Table.CombineColumns(Source,{"Folder Path", "Name"},Combiner.CombineTextByDelimiter("", QuoteStyle.None),"Merged"),
   RemovedOtherColumns = Table.SelectColumns(MergedColumns,{"Merged"}),
   #"Results" = Table.AddColumn(RemovedOtherColumns , "GetCsvs", each fcnCleanOneCSV_v2([Merged])),
   #"Removed Columns" = Table.RemoveColumns(Results,{"Merged"}),
   #"Expanded GetCsvs" = Table.ExpandTableColumn(#"Removed Columns", "GetCsvs", {"Usage Date,Meter Category,Meter Id,Meter Sub-category,Meter Name,Meter Region,Unit,Consumed Quantity,Resource Location,Consumed Service,Resource Group,Instance Id,Tags,Additional Info,Service Info 1,Service Info 2"}, {"Usage Date,Meter Category,Meter Id,Meter Sub-category,Meter Name,Meter Region,Unit,Consumed Quantity,Resource Location,Consumed Service,Resource Group,Instance Id,Tags,Additional Info,Service Info 1,Service Info 2"}),


   #"Demoted Headers" = Table.DemoteHeaders(#"Expanded GetCsvs"),
   #"Split Column by Delimiter" = Table.SplitColumn(#"Demoted Headers","Column1",Splitter.SplitTextByDelimiter(","),{"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8", "Column1.9", "Column1.10", "Column1.11", "Column1.12", "Column1.13", "Column1.14", "Column1.15", "Column1.16"}),
   #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Column1.1", type text}, {"Column1.2", type text}, {"Column1.3", type text}, {"Column1.4", type text}, {"Column1.5", type text}, {"Column1.6", type text}, {"Column1.7", type text}, {"Column1.8", type text}, {"Column1.9", type text}, {"Column1.10", type text}, {"Column1.11", type text}, {"Column1.12", type text}, {"Column1.13", type text}, {"Column1.14", type text}, {"Column1.15", type text}, {"Column1.16", type text}}),
   #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type"),
   #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"Usage Date", type date}, {"Meter Region", type text}}),
   #"Replaced Value" = Table.ReplaceValue(#"Changed Type1","""","",Replacer.ReplaceText,{"Meter Category", "Meter Id", "Meter Sub-category", "Meter Name", "Meter Region", "Unit", "Resource Location", "Consumed Service", "Instance Id", "Tags", "Additional Info", "Service Info 1", "Service Info 2"}),
   #"Changed Type2" = Table.TransformColumnTypes(#"Replaced Value",{{"Consumed Quantity", type number}})
in
  #"Changed Type2"

From rows 1 to 6, we get all the files in the folder, then combine columns to get the full path of each file. We then pass that path to the function previously defined. With the command Table.SplitColumn, on line 11, we rebuild the result as a table with multiple columns.
The rest of the query cleans up the result by changing the columns' types and removing undesired characters.


Extract “Statement”


To get the Statement recordset, it's the same thing, except that we will use Table.Range, since the rows we are looking for are between Provisioning Status and Daily Usage.
//== fcnGetStatement ========================================== 

(filePath) =>
let
   fnRawFileContents = (fullpath as text) as table =>
let
   Value = Table.FromList(Lines.FromBinary(File.Contents(fullpath)),Splitter.SplitByNothing())
in Value,

    Source = fnRawFileContents(filePath),
    #"Daily Usage Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Daily Usage")),
    #"DailyPosition" = Table.PositionOf(Source, #"Daily Usage Row" {0}),
    #"Statement Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Statement")),
    #"StatementPosition" = Table.PositionOf(Source, #"Statement Row" {0}),
    #"SelectedRows" = Table.Range(Source,(StatementPosition+1),(DailyPosition - StatementPosition )-2),
    #"Result" = Table.PromoteHeaders(SelectedRows)
in
    Result
And once again we loop through every file and do some clean-up.
//== Query Statements ========================================

let
    Source = Folder.Files("C:\Azure_Consumption_demo\CSV_v2\"),
    MergedColumns = Table.CombineColumns(Source,{"Folder Path", "Name"},Combiner.CombineTextByDelimiter("", QuoteStyle.None),"Merged"),
    RemovedOtherColumns = Table.SelectColumns(MergedColumns,{"Merged"}),
    #"Results" = Table.AddColumn(RemovedOtherColumns , "GetCsvs", each fcnGetStatement([Merged])),
    #"Removed Columns" = Table.RemoveColumns(Results,{"Merged"}),
    #"Expanded GetCsvs" = Table.ExpandTableColumn(#"Removed Columns", "GetCsvs", {"Billing Period,Meter Category,Meter Sub-category,Meter Name,Meter Region,SKU,Unit,Consumed Quantity,Included Quantity,Within Commitment,Overage Quantity,Currency,Overage,Commitment Rate,Rate,Value"}, {"Billing Period,Meter Category,Meter Sub-category,Meter Name,Meter Region,SKU,Unit,Consumed Quantity,Included Quantity,Within Commitment,Overage Quantity,Currency,Overage,Commitment Rate,Rate,Value"}),


    #"Demoted Headers" = Table.DemoteHeaders(#"Expanded GetCsvs"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Demoted Headers","Column1",Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv),{"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8", "Column1.9", "Column1.10", "Column1.11", "Column1.12", "Column1.13", "Column1.14", "Column1.15", "Column1.16"}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Split Column by Delimiter"),
    #"Replaced Value" = Table.ReplaceValue(#"Promoted Headers","""","",Replacer.ReplaceText,{"Meter Category", "Meter Sub-category", "Meter Name", "Meter Region", "SKU", "Unit"})
in
    #"Replaced Value"

Once all that is done, the fun can begin!




References




Reading Notes #210

Suggestion of the week


Cloud


Databases


Reading Notes #209

Cloud


Databases


Programming




Reading Notes #204

Suggestion of the week


Cloud


Programming


Databases


Miscellaneous

  • Going Back to One (Alexandre Brisebois) - Organizing our work to become a performer can easily be done in Windows 10.
  • Static Site or CMS? (Brian Rinaldi) - Nice post that gives insights to answer one of the most frequent questions when people start a blog/website.


Reading Notes #203

Suggestion of the week


Cloud


Programming


Databases


Miscellaneous

  • Going Back to One (Alexandre Brisebois) - Organizing our work to become a performer can easily be done in Windows 10.

~Frank



Reading Notes #195

Suggestion of the week


Cloud


Database


Programming


~Frank