Reading Notes #277

Cloud

Programming

Miscellaneous

  • Bots are the new Apps – Part 2 (Alexandre Brisebois) - This post asks a lot of questions. Bots could be very powerful, but are they easy to build? Iterate, test, fail, learn, and try again.


Two Ways To Build a Recursive Logic App

Almost everything is possible. It's always a question of how much time and energy we have. The other day I was building an Azure Logic App, and I needed to make it recursive. At first, I thought that wouldn't be a problem because we can easily call a Logic App from another one. This is true: nested Logic Apps are possible, but you can only call a different one. The "compiler" doesn't allow recursive calls. I quickly found a workaround, but the day after, I came up with a cleaner solution.

In this post, I will share both ways to create a recursive call of a Logic App.

Common Parts


Let's assume that our goal is to crawl a folder structure. It could be with any file connector: Dropbox, OneDrive, Google Drive, etc. For this post, I will use the SharePoint Online connector.

First, let's create our Logic App. Use the HTTP Request-Response template.

  1. Our Logic App will receive the folder path that it needs to process as a parameter. This is easily done by defining a JSON schema in the Request trigger.
    {
        "$schema": "http://json-schema.org/draft-04/schema#",
        "properties": {
            "FolderName": {
            "type": "string"
            }
        },
        "type": "object"
    }
  2. To list all the content of the current folder, add a List folder action from the SharePoint Online connector. The File identifier should be the trigger parameter: FolderName.

    Note: You need to edit the code behind for this action. I noticed a strange behavior with spaces in the folder path.

    The current code should look like this:
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://mysharepoint.sharepoint.com/sites/FrankSiteTest'))}/folders/@{encodeURIComponent(triggerBody()?['FolderName'])}" 

    Change it by doubling the encodeURIComponent around FolderName:
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://mysharepoint.sharepoint.com/sites/FrankSiteTest'))}/folders/@{encodeURIComponent(encodeURIComponent(triggerBody()?['FolderName']))}" 
  3. For each element returned, we need to check whether it's a folder. One property of the returned object is IsFolder; we just need to use it in our condition:
    @equals(bool(item()?['IsFolder']), bool('True'))
    1. If it's a folder, we need to make a recursive call, passing the path of the current folder as FolderName (see the sketch after this list).
    2. Otherwise, when it's a file, do whatever processing is needed.
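In code view, the loop and the condition look roughly like the following. This is a minimal sketch, not designer output: the action name List_folder is an assumption, the recursive call from Method 1 or Method 2 goes in the "actions" block, and the file processing goes in the "else" block.

"For_each": {
    "type": "Foreach",
    "foreach": "@body('List_folder')",
    "runAfter": {
        "List_folder": [ "Succeeded" ]
    },
    "actions": {
        "Condition": {
            "type": "If",
            "expression": "@equals(bool(item()?['IsFolder']), bool('True'))",
            "actions": { },
            "else": {
                "actions": { }
            }
        }
    }
}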

[Image: LogicApp_commun]

Method 1: The quick and easy


Since we are forced to call a different Logic App, let's clone our Logic App. Close the editor, click the Clone button, and name the copy FolderCrawler2.

[Image: cloneLogicApp]

Now we need to edit both Logic Apps by adding a call to the other one: FolderCrawler calls FolderCrawler2, and FolderCrawler2 calls FolderCrawler.

[Image: FolderCrawler2]
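In code view, the nested call is a Workflow action. Here is a minimal sketch: the subscription and resource group segments of the id are placeholders, manual is the default trigger name of a Request trigger, and Path is assumed to be the folder path property returned by List folder.

"FolderCrawler2": {
    "type": "Workflow",
    "inputs": {
        "host": {
            "triggerName": "manual",
            "workflow": {
                "id": "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/workflows/FolderCrawler2"
            }
        },
        "body": {
            "FolderName": "@{item()?['Path']}"
        }
    },
    "runAfter": {}
}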

This method is really just a workaround, but it works perfectly. What I like about it is that it uses all the IntelliSense at our disposal in the editor. Obviously, the big disadvantage is the code duplication.

Method 2: The clean and light


The real way to do a recursive call is to use the URL from the Request trigger in an HTTP POST action, then pass in the body a JSON document matching the schema, containing the FolderName value.

[Image: httpcall]
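In code view, that HTTP action could look like this minimal sketch. The uri is the callback URL displayed on the Request trigger (truncated here), and Path is again assumed to be the folder path property of the current item:

"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "POST",
        "uri": "https://prod-00.westus.logic.azure.com/workflows/.../triggers/manual/paths/invoke?...",
        "headers": {
            "Content-Type": "application/json"
        },
        "body": {
            "FolderName": "@{item()?['Path']}"
        }
    },
    "runAfter": {}
}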

I hope you enjoy this little post. If you think of another way to do a recursive call, or if you have some questions, let me know!





Reading Notes #276

Suggestion of the week


Cloud


Programming


Miscellaneous

Reading Notes #275

Cloud


Programming


Miscellaneous



~Frank



Reading Notes #274

Suggestion of the week


Cloud


Programming



Passing a file from an Azure Logic App to a Web API

(This post is also available in French.)

Logic App is one of my favorite tools in my cloud toolbox. It's very easy to connect things together, sometimes without even coding! Last week, I needed to pass a file from a SharePoint folder to an API. I have moved files tons of times using Azure Logic Apps, but this time something was not working. Thanks to Jeff Hollan (@jeffhollan), who put me on the right path by giving me great advice, my problem was quickly solved. In this post, I will share with you the little things that make all the difference in this case.

The Goal


When a file is created in a SharePoint folder, an Azure Logic App needs to get triggered and pass the file name and its content to a Web API. In this case, I'm using SharePoint, but it will work the same way for all folder connector types (ex: Dropbox, OneDrive, Box, Google Drive, etc.).

Note:
In this post, I'm using SharePoint Online, but the same thing could perfectly work with a SharePoint on-premises or in a virtual machine. In that situation, the On-premises Data Gateway needs to be installed locally. It's very easy to do; just follow the instructions. One gotcha... You MUST use the same Microsoft account of type "work or school" to connect to the Azure portal (portal.azure.com) and to install the On-premises Data Gateway.

The Web API App


Let's start by building our Web API. In Visual Studio, create a new Web API App. If you would like more details about how to create one, see my previous post. Now, create a new controller and add a new function UploadNewFile with the following code:

[SwaggerOperation("UploadNewFile")]
[SwaggerResponse(HttpStatusCode.OK)]
[Route("api/UploadNewFile")]
[HttpPost]
public async Task<HttpResponseMessage> UploadNewFile([FromUri] string fileName)
{
    if (string.IsNullOrEmpty(fileName))
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Name.");
    }

    // The file content is not in the querystring; it travels in the request body.
    var filebytes = await Request.Content.ReadAsByteArrayAsync();

    if (filebytes == null || filebytes.Length <= 0)
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Content.");
    }

    // Do what you need with the file.

    return Request.CreateResponse(HttpStatusCode.OK);
}

The [FromUri] attribute before the parameter is just a way to specify where that information is coming from. The content of the file can't be passed in the querystring, so it will be passed through the body of our HTTP request, and it will be retrieved with the code Request.Content.ReadAsByteArrayAsync(). If everything works, we return an HttpResponseMessage with HttpStatusCode.OK; otherwise, some message about the problem. You can now publish your Web API App.
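Before wiring up the Logic App, a small console app is a quick way to exercise the endpoint. This is just a sketch: the host frankdemo.azurewebsites.net is the one used later in this post, and the local file path is an arbitrary example.

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Read any local file and post its raw bytes as the request body.
            var bytes = File.ReadAllBytes(@"C:\temp\sample.txt");
            var content = new ByteArrayContent(bytes);

            // The file name goes in the querystring; the content goes in the body.
            var response = await client.PostAsync(
                "https://frankdemo.azurewebsites.net/api/UploadNewFile?fileName=sample.txt",
                content);

            Console.WriteLine(response.StatusCode);
        }
    }
}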

In order to be able to see our Web API App from our Logic App, one more thing needs to be done. From the Azure portal, select the freshly deployed App Service, and from the options section (the left area with all the properties) select CORS, then type * and save.

[Image: changeCORS]

The Logic App


Assuming that you already have a SharePoint site up and running, let's create the new Logic App. Once the Logic App is deployed, click the Edit button to go into the designer. Select the Blank template. For this post, I need a SharePoint trigger: When a file is created. At this point, you will be asked to answer a few questions to create your SharePoint connector. Once it's done, select the folder where you will be "dropping" your files.

Now that the trigger is done, we will add our first (and only) action. Click Add Step, select available functions, then our App Service, and finally the method UploadNewFile.
[Image: SelectApiApp]
Thanks to Swagger, Logic App will be able to generate a parameter form for us. Put the file name in the Filename parameter textbox. The Logic App should look like this.

[Image: FullLogicApp]

The last thing we need to do is tell our Logic App to pass the file content in the body of the HTTP request to the API. Today, it's not possible to do it using the designer. As you probably know, behind that gorgeous UI sits a simple JSON document, and it's by editing this document that we will be able to specify how to pass the file content.

Switch to Code view and find the step that calls our API App. Simply add "body": "@triggerBody()" to that node. That will tell Logic App to bind the body of the trigger (the file content) and pass it to the body of our web request. The code should look like this:

"UploadNewFile": {
    "inputs": {
        "method": "post",
        "queries": {
        "fileName": "@{triggerOutputs()['headers']['x-ms-file-name']}"
        },
        "body": "@triggerBody()",
        "uri": "https://frankdemo.azurewebsites.net/api/UploadNewFile"
    },
    "metadata": {
        "apiDefinitionUrl": "https://frankdemo.azurewebsites.net/swagger/docs/v1",
        "swaggerSource": "website"
    },
    "runAfter": {},
    "type": "Http"
}

You can now save and exit the edit mode. The solution is ready, enjoy!


Reading Notes #273

Cloud


Programming


Miscellaneous



Secure an ASP.NET MVC multi-tenant Power BI Embedded report hosted in an Azure WebApp

Note: This post was originally published on Microsoft MVP Award blog, as part of the Technical Tuesday series.

Power BI gives us the possibility to create amazing reports. While it's great to be able to share those reports from the very secure Power BI portal, sometimes we need to share them inside other applications or websites. Once again, Power BI doesn't disappoint, providing Power BI Embedded. In this post, I will explain how to use Power BI Embedded and make it secure so each tenant can see only his data.

The Problem

Many posts exist online that explain how to use filters to change which data is visible in our reports, but filters can easily be changed by the user. Even if you hide the filter pane, those settings could easily be modified using JavaScript... Therefore, it's definitely not the best way to secure private information.

The Solution

In this post, I will be using roles to limit access to the data. The well-known Adventure Works database will be used to demonstrate how to partition the data. In this case, we will be using the Customer table.

In Azure

Open the Azure portal to create a Power BI Embedded component. Of course, in a real project, it would be better to create it with an Azure Resource Manager (ARM) template, but to keep this post simple we will create it in the portal. Click the big green "+" at the top left corner. In the search box, type powerbi and hit Enter. Select Power BI Embedded in the list and click the Create button. Once it's created, go to the Access Keys property of the brand-new Power BI Workspace Collection and take note of the key. We will need that key later to upload our Power BI report.

[Image: CreateWorkSpaceCollection]
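For reference, a minimal ARM template for the workspace collection could look like this sketch. The resource type, apiVersion, and sku values are assumptions based on the Microsoft.PowerBI provider; double-check them before using it.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [
        {
            "type": "Microsoft.PowerBI/workspaceCollections",
            "apiVersion": "2016-01-29",
            "name": "FrankWrkSpcCollection",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "S1",
                "tier": "Standard"
            }
        }
    ]
}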

For this demo, the data source will be Adventure Works in an Azure SQL Database. To create it, simply click the "+" button again and select Database. Be sure to select Adventure Works as the source if you want to reproduce this demo.

[Image: createDB]


In Power BI Desktop

Power BI Desktop is a free tool from Microsoft that will help us create our report; it can be downloaded here.
Before we get started, two options need to be modified. Go in the File menu and select Options and Settings, then Options. The first one is in the Preview Features section (tab); check the option Enable cross filtering in both directions for DirectQuery. The second is in the DirectQuery section; check the option Allow unrestricted measures in DirectQuery mode. It's a good idea to restart Power BI Desktop before continuing.

[Image: powerbioptions]

To create our report, we first need to connect to our data source, in this case our Azure Database. Click the Get Data button, then Azure, and after that Microsoft Azure SQL Database. It's important to pay attention to the type of connection, Import or DirectQuery, because you won't be able to change it afterward; you would need to rebuild your report from scratch. For this case, select DirectQuery.
This report will display information about invoice details. Be sure to include the table that will be used for your roles; in this case, I will be using Customer. Each customer must see only their own invoices.

[Image: tables]

The report will contain two charts: the left one is a bar chart showing the invoice history; the right one is a pie chart showing how the products in the invoice(s) are distributed by category.
Note: in the sample database, all customers have only one invoice, and they all have the same date.

[Image: chart_noRole]

Now we need to create our dynamic role. In the Modeling tab, click Manage Roles and create a CustomerRole mapping the CompanyName of the Customer table to the variable USERNAME().
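In the Manage Roles dialog, the table filter is a DAX expression. For the Customer table, it should look like this one-liner:

[CompanyName] = USERNAME()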

[Image: genericRole]

Of course, to test that our charts are really dynamic, create other roles and give them specific values, ex: "Bike World" or "Action Bicycle Specialists". To visualize your report as those users, simply click View as Roles in the Modeling tab and select the role you want.

[Image: ViewAs]

See how the charts look when seen as "Action Bicycle Specialists".

[Image: chart_withRole]

The report is now ready. Save it and we will need it soon.


Powerbi-cli

To upload our report in our Azure Workspace Collection, I like to use PowerBI-CLI because it runs everywhere, thanks to Node.js.
Open a command prompt or Terminal and execute the following command to install PowerBI-CLI:
npm install powerbi-cli -g
Now if you type 'powerbi', you should see the powerbi-cli help displayed.

[Image: powerbicli]

It's time to use the access key we got earlier: pass it in this command to create a workspace in our workspace collection.

//== Create Workspace ===========
powerbi create-workspace -c FrankWrkSpcCollection -k my_azure_workspace_collection_access_key

Now, let's upload our Power BI report into Azure. Retrieve the workspace ID returned by the previous command and pass it as the parameter -w (workspace).

//== Import ===========
powerbi import -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -f "C:\powerbidemo\CustomerInvoices.pbix" -n CustomerInvoices -o

Now we will need to update the connection string of our dataset. Get its ID with the following command:

//== Get-Datasets ===========
powerbi get-datasets -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key 

Now update the connection string, passing the dataset ID with the parameter -d:

//== update-connection ===========
powerbi update-connection -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -d 01fcabb6-1603-4653-a938-c83b7c45a59c -u username@servername -p password


In Visual Studio

All the Power BI Embedded part is now completed. Let's create the new ASP.NET MVC Web Application. A few NuGet packages are required; be sure to have these versions or newer:
  • Microsoft.PowerBI.AspNet.Mvc version="1.1.7"
  • Microsoft.PowerBI.Core version="1.1.6"
  • Microsoft.PowerBI.JavaScript version="2.2.6"
  • Newtonsoft.Json version="9.0.1"
By default, Newtonsoft.Json is already there but needs an upgrade.
Update-Package Newtonsoft.Json
And for the Microsoft.PowerBI one, an install command should take care of all the other dependencies.

Install-Package Microsoft.PowerBI.AspNet.Mvc

We also need to add all the access information we previously used with powerbi-cli into our application. Let's add them in the web.config.

...
<appSettings>
    <add key="powerbi:AccessKey" value="my_azure_workspace_collection_access_key" />
    <add key="powerbi:ApiUrl" value="https://api.powerbi.com" />
    <add key="powerbi:WorkspaceCollection" value="FrankWrkSpcCollection" />
    <add key="powerbi:WorkspaceId" value="01fcabb6-1603-4653-a938-c83b7c45a59c" />
</appSettings>
...

Here is the code of the InvoicesController:

using System;
using System.Configuration;
using System.Linq;
using System.Web.Mvc;
using demopowerbiembeded.Models;
using Microsoft.PowerBI.Api.V1;
using Microsoft.PowerBI.Security;
using Microsoft.Rest;
namespace demopowerbiembeded.Controllers
{
    public class InvoicesController : Controller
    {
        private readonly string workspaceCollection;
        private readonly string workspaceId;
        private readonly string accessKey;
        private readonly string apiUrl;
        public InvoicesController()
        {
            this.workspaceCollection = ConfigurationManager.AppSettings["powerbi:WorkspaceCollection"];
            this.workspaceId = ConfigurationManager.AppSettings["powerbi:WorkspaceId"];
            this.accessKey = ConfigurationManager.AppSettings["powerbi:AccessKey"];
            this.apiUrl = ConfigurationManager.AppSettings["powerbi:ApiUrl"];
        }
        private IPowerBIClient CreatePowerBIClient
        {
            get
            {
                var credentials = new TokenCredentials(accessKey, "AppKey");
                var client = new PowerBIClient(credentials)
                {
                    BaseUri = new Uri(apiUrl)
                };
                return client;
            }
        }
        public ReportViewModel GetFilteredReport(string clientName)
        {
            using (var client = this.CreatePowerBIClient)
            {
                var reportsResponse = client.Reports.GetReportsAsync(this.workspaceCollection, this.workspaceId);
                var report = reportsResponse.Result.Value.FirstOrDefault(r => r.Name == "CustomerInvoices");
                var embedToken = PowerBIToken.CreateReportEmbedToken(this.workspaceCollection, this.workspaceId, report.Id, clientName, new string[] { "CustomerRole" });
                var model = new ReportViewModel
                {
                    Report = report,
                    AccessToken = embedToken.Generate(this.accessKey)
                };
                return model;
            }
        }
        public ActionResult Index()
        {
            var report = GetFilteredReport("Action Bicycle Specialists");
            return View(report);
        }
    }
}

The interesting part of this controller is in the method GetFilteredReport. First, it gets all the reports from our workspace, then looks for the one named "CustomerInvoices". The next step is where the loop gets closed: it creates the token. Of course, we pass the workspace collection, workspace, and report references, and that could be it; but passing only those references would return our report with all customers displayed... and obviously that's not what we want right now. The two last parameters are a username and an array of roles. When we created roles in Power BI Desktop, we created one called CustomerRole that compared the CompanyName to the variable USERNAME(). So here we pass the client name as the username and specify that we want to use the role "CustomerRole".
The last piece of the puzzle is the View, so let's add one.

@model demopowerbiembeded.Models.ReportViewModel
<style>iframe {border: 0;border-width: 0px;}</style>
<div id="test1" style="border-style: hidden;">
    @Html.PowerBIReportFor(m => m.Report, new { id = "pbi-report", style = "height:85vh", powerbi_access_token = Model.AccessToken })
</div>
@section scripts
{
    <script src="~/Scripts/powerbi.js"></script>
    <script>
        $(function () {
            var reportConfig = {
                settings: {
                    filterPaneEnabled: false,
                    navContentPaneEnabled: false
                }
            };
            var reportElement = document.getElementById('pbi-report');
            var report = powerbi.embed(reportElement, reportConfig);
        });
    </script>
}

One great advantage of using ASP.NET MVC is that we have @Html.PowerBIReportFor at our disposal. Then we can instantiate the report with a call to powerbi.embed(reportElement, reportConfig), where I pass some configuration to remove the navigation and filter panes, but that's optional.

Now if we run our project, you should see a result looking like this.

[Image: finalresult]


Wrap it up

Voila! This, of course, was a demo and should be optimized. Please leave a comment if you have any questions, or don't hesitate to contact me. It's always great to chat with you.





Reading Notes #272

Cloud


Programming


Databases