
Lessons learned when deploying multiple databases to Azure with VSTS

It's been a while since I had worked with Visual Studio Team Services (VSTS), and it was a real pleasure to get back in that area. For the solution I was working on, we needed to keep the current database up and running while deploying a new version. For this purpose, we decided to append the release number to the database name (ex: MyDatabase363). In our Build and Release processes, we needed to identify which databases were from the last release. In this post, I will show what I did using an inline PowerShell script to get that number and set it as an environment variable so it can be accessed by other tasks.

To get started, let's add an Azure PowerShell task to our build definition. In this post, I use a build process, but of course this is also valid for a release process. To find the task quickly, use the search text box. I will add two of those: one to get the number, and a second to validate that this value is now set as an environment variable and readable from other tasks.
AddPowerShellTask

Now it's time to set up the first task. Fill out all the properties and select Inline Script as the Script Type. It should look like this.

InlineScript

Let's examine the code.
Get last Release Number
$matchingResources = Find-AzureRmResource -ResourceNameContains "mydatabase" -ResourceType "Microsoft.Sql/servers/databases"

$lastRelease = 0

ForEach($resource in $matchingResources)
{
    if ($resource.ResourceName -match '(\d)+$') {
        if($lastRelease -lt $matches[0]){
            $lastRelease = $matches[0]
        }
    }
}
Write-Output "The last release number is:  $lastRelease"
Write-Output ("##vso[task.setvariable variable=lastReleaseNumber;]$lastRelease")
On the first line, I use the Azure PowerShell cmdlet Find-AzureRmResource to get an array of all the databases currently online in my resource group whose name contains a specific string. In this case, it's the name of the database without the release number. Then I loop through all returned resources, using a very simple regex to extract the release number and keep the biggest one (the last release).

To close that script, we have two outputs. The first one gives feedback in the logs, because it's always good to have some information there. The second one looks more complicated, but if you split it, it's easier to see what's happening. In fact, we are producing a VSTS (previously called Visual Studio Online, which is why it's VSO) logging command to initialize a variable, ##vso[task.setvariable variable=lastReleaseNumber;], and of course assign to it our last release number $lastRelease.
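For reference, here is the general shape of that logging command as a minimal sketch; the variable names and values are hypothetical, and the optional issecret flag simply masks the value in the logs:

# Set a variable that the following tasks can read as an environment variable
Write-Output "##vso[task.setvariable variable=myVariable;]42"
# Same thing, but masked in the logs (hypothetical secret value)
Write-Output "##vso[task.setvariable variable=mySecret;issecret=true]P@ssw0rd!"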

To validate that we really found our last release number and assigned it to a variable, let's try to read it back, but from another step. That can easily be done with this code in the second task we created:
Validate the last Release Number
$number = $env:lastReleaseNumber

Write-Output "Confirmation, the last Release Number is:  $number "
The only thing missing before we can run our test is to create that environment variable. To do it, simply go to the Variables tab and add it there.

env-variable

It's all set, run the build and you should see something similar in your logs.

trace







Let’s talk Hybrid Cloud with Clemens Vasters at a Special Montreal Event.

CVasters

The summer is mostly here, and that usually means a break for the community group, but the Azure Group of MSDevMtl still has one more surprise for you. Clemens Vasters, Microsoft's "Principal Messenger", driving global technical strategy for Microsoft's Azure Messaging platform services (Azure Service Bus, Azure Event Hubs, and Azure Relay), is coming to Montreal!

Join us on Friday, June 9th 2017, for a great talk about "Hybrid cloud". These days, most solutions need to run on a combination of on-premises and public cloud assets. In this morning session, we will learn how those systems can be integrated. This perspective is, however, largely constrained to data-center-like assets, and typically to the scope of one organization.

For this special event, we decided to remove the usual fee, so it's a FREE event!

Register quickly on the Meetup site because seats are limited. Register here: https://www.meetup.com/msdevmtl/events/239731426/


See you there!


From a Docker container to MySQL as a Service in Azure in 5 minutes

Hello MySQL! It's been a while, eh? You were at version 3 something, and I was just getting started with my professional career. We had fun for years... Then, you know, things changed, and I did something else. I was really happy when Microsoft announced, at MSBuild, the availability of MySQL as a Service in Azure.
 
SearchMySQL

Creating a MySQL database with the portal is extremely simple. As usual, you enter the server name, database name and the admin's password. At the time I'm writing this post, it was not possible to use any CLI, but I'm sure it will be available shortly. For those who are not used to Database as a Service in Azure, one thing you will need to do to get access to your database from your computer is whitelist your IP. It's very easy to do from the Azure portal: just select the Connection Security tab on the left menu and add your address. Oh! And don't forget to click the Save button. ;)

Firewall

During my tests, I tried different applications (WordPress, Azure Web App, custom on-premises app) that use MySQL as a backend database. I didn't notice any problem, and performance was great. It was just... simpler; no server to configure, no VM to configure, no updates. The only "issue" I got was trying to connect Power BI Desktop to MySQL, but I think it's more related to the drivers since the service was still in early preview. I notified Microsoft, and I'm sure it will be fixed shortly.

Since it had been a while since I did some real work with MySQL, I didn't have any client installed on my laptop. In fact, I had no idea which one I should take.

I knew we could run a CLI inside a Docker container with an interactive interface, so I decided to give it a try. A quick docker search mysql showed me that an image existed. Here are the steps to get set up.

First, let's download the image and create an instance of MySQL 8.0 named mySQLTools:

docker run --name mySQLTools --env "MYSQL_ROOT_PASSWORD=Passw0rd" -d mysql:8

Then, using the -it option, let's bring a bash prompt into our terminal.

docker exec -it mySQLTools bash -l 

Finally, we connect with the mysql client using the usual settings (note that there must be no space between -p and your password):

mysql -h <ServerName>.database.windows.net -u <UserName>@<ServerName> -p<MyPassword> <DatabaseName>

result
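As a side note, the last two steps can be combined into a single command if you prefer (same placeholders as above):

# Run the mysql client directly inside the container, connecting to the Azure database
docker exec -it mySQLTools mysql -h <ServerName>.database.windows.net -u <UserName>@<ServerName> -p<MyPassword> <DatabaseName>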

Voila! That's all it takes to get started. And by the way, it will also work great with an Azure SQL Database, using the sql-cli image this time:


docker pull shellmaster/sql-cli 

docker run -it --rm --name=sqlTools shellmaster/sql-cli mssql -s ServerName.database.windows.net  -u UserName@ServerName  -p YourPassword -d DatabaseName -e





A static website and some little tricks with Azure Functions Proxies

(Updated 2018-02-08)

Recently, I did a few presentations about Azure Functions. The reaction was always very positive and attendees left with tons of project ideas in their heads. In this post, I would like to add a few interesting features that I didn't have the time to talk about.

You prefer to watch a video instead of reading? No problem, skip to the Explain in a Video section at the end of this post.

Let's get started


From the Azure portal (portal.azure.com), select an Azure Function domain, or create a new one. Then we need to create a function that we will use as our backend. Click on the "+" sign beside Functions. In this post, we will be using the HttpTrigger-CSharp template, but other templates will work too. Once you select the template, you will be able to enter the name and select the Authorization level. This last choice will affect how your function can be accessed. For example, if you select Anonymous, then your function will be accessible to everyone directly using the URL https://notesfunctions.azurewebsites.net/api/SecretFunction, where 'notesfunctions' is my function domain name. But if you select the Function or Admin level, then you will need to pass a function key or master key (ex: https://notesfunctions.azurewebsites.net/api/SecretFunction?code=I4BN6NjaZBmPNqebqnX8pQAPZwe1TI/O4TCbvB1aaaaao7uRO5yyw==). For this post, let's use the Function level. When ready, click the Create button.

CreateFunction

Using Postman, your favorite HTTP tool, or even the function Test section (located on the right side of the editor in the Function blade), you can now test your function. To be able to test our function, we need to know the URL. To get the URL of your function, once your function is selected (when you see the code), click on </> Get function URL at the top of the screen.

TestSecretDirect

Note that the querystring has a parameter named code that receives our function key. When this parameter is not present, you will receive an HTTP 401 Unauthorized message. The function generated by the template also expects a value 'Name' that can be passed by the querystring or by a JSON payload with a Name property in the HTTP request body.
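If you prefer a script to Postman, here is a minimal PowerShell sketch of the same test; the URL follows my example domain above, and the key placeholder is obviously hypothetical:

# Call the function, passing the function key in the 'code' parameter and a Name value in the JSON body
$functionUrl = "https://notesfunctions.azurewebsites.net/api/SecretFunction"
$functionKey = "<your-function-key>"
Invoke-RestMethod -Method Post -Uri "${functionUrl}?code=${functionKey}" `
    -ContentType "application/json" `
    -Body (@{ Name = "Frank" } | ConvertTo-Json)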

Azure Functions Proxies


Functions Proxies are currently in preview. With them, any function app can now define an endpoint that serves as a reverse proxy to another API, webhook, function app, or anything else.
Before being able to create new Azure Functions Proxies, you need to enable them. From the Function blade, select the Settings tab at the top of the screen, then click the On button under the Proxies section.

EnableProxies

Now let's create our first Function Proxy. Click on the "+" on the right of Proxies (Preview). Enter the following values.

ProxySalutation

In the Backend URL, note that %Host_Name% is used in the URL; this is NOT an environment variable. In fact, surrounding a key with % is a very useful Azure Functions feature that gives us the ability to read values directly from the Application settings.

PlatformFeatures

To get to the Application settings, select the Function App domain (the root node), then the Platform features tab at the top of the screen. In the image above, point A shows how to access the Application settings, and point B shows how to access the App Service Editor that we will use later in this post.

If it's not already done, add a new key-value pair in Application settings: Host_Name with its value. Then, from Postman, call this new proxy function. Note that now you don't need to pass the key, since this part is done under the hood by the proxy.

TestSalutation

Do more with your Proxies


Okay, now that we have a proxy up and running, let's switch to the App Service Editor to do more "advanced" stuff (the editor is available through the Platform features tab). Once you are in the editor, select the file proxies.json to open it.

editor

As you can see, we only have one proxy defined. Let's duplicate our proxy. Rename the copy "Override", and change the route value to override too. If you test this new proxy, it will work just like the other one. Let's change that a little: below the backendUri property, add a new node called responseOverrides. With proxies, it is possible to edit the HTTP properties. To change the Content-Type to text instead of JSON, add "response.headers.Content-Type": "text/plain" inside our new responseOverrides node (be aware, it's case sensitive). Test Override again and you will see that the content has indeed changed.

Continuing that way, you could use Azure Function Proxies as mocks. For example, drop the backendUri property and override the response body to return a fixed value, and voila! You built yourself a great mock-up! This is very useful! To illustrate this, add a new proxy using this code:
"Fake": {
    "matchCondition": {
        "route": "fake"
    },
    "responseOverrides": {
        "response.headers.Content-Type": "text/plain",
        "response.body": "Hello from Azure"
    }
}
If you call this last proxy, no backend is called, but the HTTP call still works.
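You can confirm it quickly from PowerShell; this sketch assumes your function domain is notesfunctions and that the proxy route is served from the root of the domain:

# Call the 'fake' proxy; no backend is involved, only the overridden response comes back
$response = Invoke-WebRequest -Uri "https://notesfunctions.azurewebsites.net/fake"
$response.Headers["Content-Type"]   # text/plain
$response.Content                   # Hello from Azure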

Static WebSite


Everybody knows that Azure Storage is very inexpensive. Wouldn't it be wonderful if we could put a static website in that storage? Of course, you can do it, as long as the URL is complete. However, who types the URL completely, with the file name and file extension (ex: http://www.frankysnotes.com/index.html)? Well, now with Azure Function Proxies, we can fix that! Add another proxy to the proxies.json file using this code:
"StaticNotes": {
    "matchCondition": {
        "methods": [
            "GET"
        ],
        "route": "/"
    },
    "backendUri": "https://%blob_url%/dev/index.html"
}
This new proxy will "redirect" all root HTTP GET calls to our index.html file waiting in our Azure Blob storage. For a more professional look, you just need to add a custom domain name to your Function App, and you've got the perfect super-light, low-cost website for your promotion campaign or event.

static


Explain in a Video





References:

  • Postman : getpostman.com
  • App Service Editor: https://{function domain name}.scm.azurewebsites.net (ex: https://notesfunctions.scm.azurewebsites.net)



Two Ways To Build a Recursive Logic App

Mostly everything is possible. It's always a question of how much time and energy we have. The other day I was building an Azure Logic App, and I needed to make it recursive. At first, I thought that wouldn't be a problem because we can easily call a Logic App from another one. This is right, nested Logic Apps are possible, but it's only possible to call another one; the editor doesn't allow a direct recursive call. I quickly found a workaround, but the day after I came up with a cleaner solution.

In this post, I will share both ways to create a recursive call of a Logic App.

Common Parts


Let's assume that our goal is to crawl a folder structure. It could be with any file connector: Dropbox, OneDrive, Google Drive, etc. For this post, I will use the SharePoint Online connector.

First, let's create our Logic App. Use the Http Request-Response template.

  1. Our Logic App will receive the folder path that it needs to process as a parameter. This is easily done by defining a JSON schema in the Request trigger.
    {
        "$schema": "http://json-schema.org/draft-04/schema#",
        "properties": {
            "FolderName": {
            "type": "string"
            }
        },
        "type": "object"
    }
  2. To list all the content of the current folder, add a List folder action from the SharePoint Online connector. The File identifier should be the trigger parameter: FolderName.

    Note: You need to edit the code behind this action; I noticed a strange behavior with spaces in the folder path.

    The current code should look like this:
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://mysharepoint.sharepoint.com/sites/FrankSiteTest'))}/folders/@{encodeURIComponent(triggerBody()?['FolderName'])}" 

    Change it by doubling the encodeURIComponent
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://mysharepoint.sharepoint.com/sites/FrankSiteTest'))}/folders/@{encodeURIComponent(encodeURIComponent(triggerBody()?['FolderName']))}" 
  3. For each element returned, we need to check if it's a folder. One property of the returned object is IsFolder; we just need to use it in our condition:
    @equals(bool(item()?['IsFolder']), bool('True'))
    1. If it's a folder, we need to do some recursive call passing the path of the current folder to FolderName.
    2. Otherwise, when it's a file, do some process.

LogicApp_commun

Method 1: The quick and easy


Since we are forced to call a different Logic App, let's clone our Logic App. Close the editor, click the Clone button, and name the copy FolderCrawler2.

cloneLogicApp

Now we need to edit both Logic Apps by adding a call to the other one: FolderCrawler calls FolderCrawler2, and FolderCrawler2 calls FolderCrawler.

FolderCrawler2

This method is really just a workaround, but it works perfectly. What I like about it is that it uses all the Intellisense at our disposal in the editor. Obviously, the big disadvantage is the code duplication.

Method 2: The Clean and light


The real way to do a recursive call is to use the URL from the Request trigger in an HTTP POST action. Then, in the body, pass a JSON payload matching the schema and containing the path of the current folder in FolderName.

httpcall
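To illustrate what that HTTP action does, here is a minimal PowerShell sketch of the same call made by hand; the callback URL is a placeholder to copy from your own Request trigger, and the folder path is hypothetical:

# POST to the Logic App Request trigger; the body matches the JSON schema defined earlier
$callbackUrl = "<your-request-trigger-callback-url>"
Invoke-RestMethod -Method Post -Uri $callbackUrl `
    -ContentType "application/json" `
    -Body (@{ FolderName = "/Shared Documents/SubFolder" } | ConvertTo-Json)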

I hope you enjoyed this little post. If you think of another way to do a recursive call, or if you have some questions, let me know!





Passing a file from an Azure Logic App to a Web API

(This post is also available in French.)

Logic Apps are one of my favorite tools in my cloud toolbox. It's very easy to connect things together, sometimes without even coding! Last week, I needed to pass a file from a SharePoint folder to an API. I have moved files tons of times using Azure Logic Apps, but this time something was not working. Thanks to Jeff Hollan (@jeffhollan), who put me on the right path by giving me great advice, my problem was quickly solved. In this post, I will share with you the little things that make all the difference in this case.

The Goal


When a file is created in a SharePoint folder, an Azure Logic App needs to get triggered and pass the file name and its content to a Web API. In this case, I'm using SharePoint, but it will work the same way for all folder connector types (ex: Dropbox, OneDrive, Box, Google Drive, etc.).

Note:
In this post, I'm using SharePoint Online, but the same thing could perfectly work with a SharePoint on-premises or in a virtual machine. In that situation, the On-premises Data Gateway needs to be installed locally. It's very easy to do, just follow the instructions. One gotcha... You MUST use the same Microsoft account of type "work or school" to connect to the Azure portal (portal.azure.com) and to install the On-premises Data Gateway.

The Web API App


Let's start by building our Web API. In Visual Studio, create a new Web API App. If you would like more details about how to create one, see my previous post. Now, create a new controller and add a new function UploadNewFile with the following code:

[SwaggerOperation("UploadNewFile")]
[SwaggerResponse(HttpStatusCode.OK)]
[Route("api/UploadNewFile")]
[HttpPost]
public HttpResponseMessage UploadNewFile([FromUri] string fileName)
{
    if (string.IsNullOrEmpty(fileName))
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Name.");
    }

    var filebytes = Request.Content.ReadAsByteArrayAsync();

    if (filebytes.Result == null || filebytes.Result.Length <= 0)
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Content.");
    }

    // Do what you need with the file.

    return Request.CreateResponse(HttpStatusCode.OK);
}

The [FromUri] attribute before the parameter is just a way to specify where that information is coming from. The content of the file couldn't be passed in the querystring, so it will be passed through the body of our HTTP request, and it will be retrieved with the code Request.Content.ReadAsByteArrayAsync(). If everything works, we return an HttpResponseMessage with HttpStatusCode.OK, otherwise some message about the problem. You can now publish your Web API App.
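If you want to test the deployed API outside of the Logic App, a minimal PowerShell sketch like this one posts a file's bytes to it; the file path is hypothetical and the URL is the demo one used later in this post:

# Send the file bytes in the HTTP request body; the file name travels in the querystring
$bytes = [System.IO.File]::ReadAllBytes("C:\temp\sample.pdf")
Invoke-RestMethod -Method Post `
    -Uri "https://frankdemo.azurewebsites.net/api/UploadNewFile?fileName=sample.pdf" `
    -ContentType "application/octet-stream" `
    -Body $bytes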

In order to be able to see our Web API App from our Logic App, one more thing needs to be done. From the Azure portal, select the freshly deployed App Service and, from the options section (the left area with all the properties), select CORS, then type * and save it.

changeCORS

The Logic App


Assuming that you already have a SharePoint up and running, let's create the new Logic App. Once the Logic App is deployed, click the Edit button to go into the designer. Select the Blank template. In this post, I need the SharePoint trigger When a new file is created. At this point, you will be asked to answer a few questions in order to create your SharePoint connection. Once it's done, select the folder where you will be "dropping" your files.

Now that the trigger is done, we will add our first (and only) action. Click Add Step. Select the available functions, then our App Service and finally the method UploadNewFile.
SelectApiApp
Thanks to Swagger, Logic App will be able to generate a parameter form for us. Put the file name in the fileName parameter textbox. The Logic App should look like this.

FullLogicApp

The last thing we need to do is tell our Logic App to pass the file content in the body of the HTTP request to the API. Today, it's not possible to do it using the designer. As you probably know, behind that gorgeous designer sits a simple JSON document, and it's by editing this one that we will be able to specify how to pass the file content.

Switch to Code view, and find the step that calls our API App. Simply add "body": "@triggerBody()" to that node. That will tell Logic App to bind the body of the trigger (the file content) and pass it to the body of our web request. The code should look like this:

"UploadNewFile": {
    "inputs": {
        "method": "post",
        "queries": {
        "fileName": "@{triggerOutputs()['headers']['x-ms-file-name']}"
        },
        "body": "@triggerBody()",
        "uri": "https://frankdemo.azurewebsites.net/api/UploadNewFile"
    },
    "metadata": {
        "apiDefinitionUrl": "https://frankdemo.azurewebsites.net/swagger/docs/v1",
        "swaggerSource": "website"
    },
    "runAfter": {},
    "type": "Http"
}

You can now save and exit the edit mode. The solution is ready, enjoy!


Secure a Asp.Net MVC multi-tenant Power Bi Embedded hosted in an Azure WebApp

Note: This post was originally published on Microsoft MVP Award blog, as part of the Technical Tuesday series.

Power BI gives us the possibility to create amazing reports. Even if it's great to be able to share those reports from the very secure Power BI portal, sometimes we need to share them inside other applications or websites. Once again, Power BI doesn't disappoint us, by providing Power BI Embedded. In this post, I will explain how to use Power BI Embedded and make it secure so each tenant can only see his own data.

The Problem

Despite the many posts online that explain how to use filters to change what is visible in our reports, filters can easily be changed by the user. Even if you hide the filter panel, those settings could easily be modified using JavaScript... Therefore, it's definitely not the best way to secure private information.

The Solution

In this post, I will be using roles to limit access to the data. The well-known Adventure Works database will be used to demonstrate how to partition the data. In this case, we will be using the Customer table.

In Azure

Open the Azure portal to create a Power BI Embedded component. Of course, in a real project, it would be better to create it with an Azure Resource Manager (ARM) template, but to keep this post simple we will create it with the portal. Click on the big green "+" at the top left corner. In the search box, type powerbi and hit Enter. Select Power BI Embedded in the list and click the Create button. Once it's created, go to the Access Keys property of the brand-new Power BI Workspace Collection and take note of the key. We will need that key later to upload our Power BI report.

CreateWorkSpaceCollection

For this demo, the data source will be Adventure Works in an Azure SQL Database. To create it, simply click the "+" button again and select Database. Be sure to select Adventure Works as the source if you want to reproduce this demo.

createDB


In Power BI Desktop

Power BI Desktop is a free tool from Microsoft that will help us create our report; it can be downloaded here.
Before we get started, two options need to be modified. Go to the File menu and select Options and Settings, then Options. The first one is in the Preview Features section (tab); check the option Enable cross filtering in both directions for DirectQuery. The second is in the DirectQuery section; check the option Allow unrestricted measures in DirectQuery mode. It's a good idea to restart Power BI Desktop before continuing.

powerbioptions

To create our report, we first need to connect to our data source, in this case our Azure SQL Database. Click the Get Data button, then Azure, and after that Microsoft Azure SQL Database. It's important to pay attention to the type of connection, Import or DirectQuery, because you won't be able to change it afterward; you would need to rebuild your report from scratch. For this case, select DirectQuery.
This report will display information about invoice details. Be sure to include the table that will be used for your role; in this case, I will be using Customer. Each customer must see only their own invoices.

 tables

The report will contain two charts: the left one is a bar chart where you see the invoice history; the right one is a pie chart that shows how the products in the invoice(s) are distributed by category.
Note: in the sample database, all customers have only one invoice and they all have the same date.

chart_noRole

Now we need to create our dynamic role. In the Modeling tab, click on Manage Roles and create a CustomerRole mapping the CompanyName column of the Customer table to the USERNAME() function.

genericRole

Of course, to test if our charts are really dynamic, create other roles and give them specific values, ex: "Bike World" or "Action Bicycle Specialists". To visualize your report as those users, simply click on View as Roles, in the Modeling tab, and select the role you want.

ViewAs

See how the charts look when seen as "Action Bicycle Specialists".

chart_withRole

The report is now ready. Save it; we will need it soon.


Powerbi-cli

To upload our report into our Azure Workspace Collection, I like to use powerbi-cli because it runs everywhere, thanks to Node.js.
Open a command prompt or terminal and execute the following command to install powerbi-cli:
npm install powerbi-cli -g
Now, if you type 'powerbi', you should see the powerbi-cli help displayed.

powerbicli

It's time to take the access key we got previously and use it in this command to create a workspace in our workspace collection.

//== Create Workspace ===========
powerbi create-workspace -c FrankWrkSpcCollection -k my_azure_workspace_collection_access_key

Now, let's upload our Power BI report into Azure. Retrieve the workspace ID returned by the previous command and pass it as the parameter -w (workspace).

//== Import ===========
powerbi import -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -f "C:\powerbidemo\CustomerInvoices.pbix" -n CustomerInvoices -o

Now we need to update the connection string of our dataset. Get its ID with the following command:

//== Get-Datasets ===========
powerbi get-datasets -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key 

Now update the connection string, passing the datasetId with the parameter -d:

//== update-connection ===========
powerbi update-connection -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -d 01fcabb6-1603-4653-a938-c83b7c45a59c -u usename@servername -p password


In Visual Studio

All the Power BI Embedded part is now complete. Let's create the new ASP.NET MVC web application. A few NuGet packages are required; be sure to have these versions or newer:
  • Microsoft.PowerBI.AspNet.Mvc version="1.1.7"
  • Microsoft.PowerBI.Core version="1.1.6"
  • Microsoft.PowerBI.JavaScript version="2.2.6"
  • Newtonsoft.Json version="9.0.1"
By default Newtonsoft.Json is already there but needs an upgrade.
Update-Package Newtonsoft.Json
And for the Microsoft.PowerBI one, an install command should take care of all the other dependencies.

Install-Package Microsoft.PowerBI.AspNet.Mvc

We also need to add all the access information we previously used with powerbi-cli into our application. Let's add them to the web.config.

...
<appSettings>
    <add key="powerbi:AccessKey" value="my_azure_workspace_collection_access_key" />
    <add key="powerbi:ApiUrl" value="https://api.powerbi.com" />
    <add key="powerbi:WorkspaceCollection" value="FrankWrkSpcCollection" />
    <add key="powerbi:WorkspaceId" value="01fcabb6-1603-4653-a938-c83b7c45a59c" />
</appSettings>
...

Here is the code of the InvoicesController:

using System;
using System.Configuration;
using System.Linq;
using System.Web.Mvc;
using demopowerbiembeded.Models;
using Microsoft.PowerBI.Api.V1;
using Microsoft.PowerBI.Security;
using Microsoft.Rest;
namespace demopowerbiembeded.Controllers
{
    public class InvoicesController : Controller
    {
        private readonly string workspaceCollection;
        private readonly string workspaceId;
        private readonly string accessKey;
        private readonly string apiUrl;
        public InvoicesController()
        {
            this.workspaceCollection = ConfigurationManager.AppSettings["powerbi:WorkspaceCollection"];
            this.workspaceId = ConfigurationManager.AppSettings["powerbi:WorkspaceId"];
            this.accessKey = ConfigurationManager.AppSettings["powerbi:AccessKey"];
            this.apiUrl = ConfigurationManager.AppSettings["powerbi:ApiUrl"];
        }
        private IPowerBIClient CreatePowerBIClient
        {
            get
            {
                var credentials = new TokenCredentials(accessKey, "AppKey");
                var client = new PowerBIClient(credentials)
                {
                    BaseUri = new Uri(apiUrl)
                };
                return client;
            }
        }
        public ReportViewModel GetFilteredReport(string clientName)
        {
            using (var client = this.CreatePowerBIClient)
            {
                var reportsResponse = client.Reports.GetReportsAsync(this.workspaceCollection, this.workspaceId);
                var report = reportsResponse.Result.Value.FirstOrDefault(r => r.Name == "CustomerInvoices");
                var embedToken = PowerBIToken.CreateReportEmbedToken(this.workspaceCollection, this.workspaceId, report.Id, clientName, new string[] { "CustomerRole" });
                var model = new ReportViewModel
                {
                    Report = report,
                    AccessToken = embedToken.Generate(this.accessKey)
                };
                return model;
            }
        }
        public ActionResult Index()
        {
            var report = GetFilteredReport("Action Bicycle Specialists");
            return View(report);
        }
    }
}

The interesting part of this controller is in the method GetFilteredReport. First, it gets all the reports from our workspace, then looks for the one named "CustomerInvoices". The next step is where the loop gets closed; it creates the token. Of course, we pass the workspace collection, workspace and report references, and that could be it. I mean, passing only those references would result in a report where all customers are displayed... but obviously that's not what we want right now. The two last parameters are a username and an array of roles. When we created roles in Power BI Desktop, we created one called CustomerRole that was equal to the variable USERNAME(). So here we will pass the client name as the username and specify that we want to use the role "CustomerRole".
The last piece of the puzzle is the View, so let's add one.

@model demopowerbiembeded.Models.ReportViewModel
<style>iframe {border: 0;border-width: 0px;}</style>
<div id="test1" style="border-style: hidden;">
    @Html.PowerBIReportFor(m => m.Report, new { id = "pbi-report", style = "height:85vh", powerbi_access_token = Model.AccessToken })
</div>
@section scripts
{
    <script src="~/Scripts/powerbi.js"></script>
    <script>
        $(function () {
            var reportConfig = {
                settings: {
                    filterPaneEnabled: false,
                    navContentPaneEnabled: false
                }
            };
            var reportElement = document.getElementById('pbi-report');
            var report = powerbi.embed(reportElement, reportConfig);
        });
    </script>
}

One great advantage of using ASP.NET MVC is that we have an @Html.PowerBIReportFor helper at our disposal. Then we can instantiate the report with the call to powerbi.embed(reportElement, reportConfig), where I pass some configuration to remove the navigation and filter panes, but that's optional.

Now, if we run our project, you should have a result looking like this.

finalresult


Wrap it up

Voila! This, of course, was a demo and should be optimized. Please leave a comment if you have any questions, or don't hesitate to contact me. It's always great to chat with you.





Where can I put my Data In Azure


This month, I'm the guest of Mario Cardinal (@mario_cardinal) and Guy Barrette (@GuyBarrette) on their podcast The Visual Studio Talk Show, a French podcast that talks about software architecture with Microsoft technologies.
Alexandre Brisebois (@Brisebois) was also there for this episode, and the four of us spent about an hour talking about data in Azure, trying to clarify the Microsoft offering.

You can listen to the episode here:  http://visualstudiotalkshow.libsyn.com/205-alexandre-brisebois-et-franois-boucher-les-donnes-et-azure

I did a little mind map before the show to help me keep it as structured as possible. I'm sharing it with you here:
Azure Data_thumb

Version (3231x1130) here: http://cloudenfrancais.com/content/images/2017/03/Azure-Data.png

~Frank

Connect VSCode to Azure SQL Database from everywhere

Visual Studio Code looks like a simple text editor. However, the fact that it runs anywhere (Windows, Linux, macOS) and that you can highly customize it makes it very special. The way you customize your VSCode is by using extensions. The last extension I installed was mssql. In this post, I will explain how to leverage this extension to connect to your database from everywhere.

Installation


Adding the extension is very simple: go to the Extensions menu and type mssql in the search area, then click Install.

install_mssql

Configuration


To create your connection profile, you could just open the Command Palette and type connect; the extension will then ask you to enter the server name, username, password...

connect_longway
To avoid any typo, I tend to copy/paste the information, but every time the dialog lost focus, it closed. The workaround to this unfortunate problem was to open the User Settings and specify my connection information there. Here is the configuration:

"mssql.connections": [
    {
        "server": "{{put-server-name-here}}",
        "database": "{{put-database-name-here}}",
        "user": "{{put-username-here}}",
        "password": "{{put-password-here}}"
    },
    {
        "profileName": "Azure SQL FBoucher",
        "server": "sqlservername.database.windows.net",
        "database": "demodb",
        "user": "admin",
        "password": "",
        "authenticationType": "SqlLogin",
        "savePassword": true
    }
]

I kept the password empty and savePassword equal to true. This way, the first time you connect to the database, you will need to enter it, and it will be saved outside of the User Settings by the password management.
Now, to connect, you only need to open the Command Palette and type connect or mssql to see the MS SQL: Connect option, then select the connection profile you just created.

connect_profile


Using the mssql extension


Of course, you can type any query you like, but the extension offers not only snippets but also contextual auto-completion. That means that VSCode, once connected, will know all the column names of all tables and boost your productivity.

snippets
To execute the query a simple Ctrl + E, and voila!

query_and_result


Wrapping up


This fantastic extension provides a lightweight setup to write or execute any SQL script, whether you are on Windows, Linux or macOS. It doesn't have all the features of a bigger tool like SQL Server Management Studio, but when databases are not your primary focus, or to have a light setup on the go, the Visual Studio Code SQL Server extension is definitely a must.






Need to Nuke an Azure Subscription?

(This post is also available in French.)


I use my my.visualstudio (aka MSDN) Azure subscription very intensively to create content for demos or just to try new features, so I frequently need to do some cleaning.

multi-resourceGroup-boom

Here is a little script that will completely delete all resources of every resource group inside a specific subscription. To be able to execute this script, you will need the Azure PowerShell cmdlets.

The script asks you to log in, then lists all the subscriptions that this account has access to. Once you specify which one, it will list all the resources grouped by resource group. Then, as a final warning, it will require one last validation before nuking everything.

Be careful.

#================================================================
#= Very dangerous interactive script that deletes all resources
#= from all resource groups in a specific subscription
#================================================================

# How to install and configure Azure PowerShell
# https://docs.microsoft.com/en-us/powershell/azureps-cmdlets-docs/

# Login
Login-AzureRmAccount 

# Get a list of all Azure subscriptions that the user can access
$allSubs = Get-AzureRmSubscription 

$allSubs | Sort-Object Name | Format-Table -Property Name, SubscriptionId, State

$theSub = Read-Host "Enter the subscriptionId you want to clean"

Write-Host "You selected the following subscription. (It will be displayed for 15 sec.)" -ForegroundColor Cyan
Get-AzureRmSubscription -SubscriptionId $theSub | Select-AzureRmSubscription 

# Get all the resource groups
$allRG = Get-AzureRmResourceGroup

foreach ( $g in $allRG){
    Write-Host $g.ResourceGroupName -ForegroundColor Yellow 
    Write-Host "------------------------------------------------------`n" -ForegroundColor Yellow 
    $allResources = Find-AzureRmResource -ResourceGroupNameContains $g.ResourceGroupName
    if($allResources){
        $allResources | Format-Table -Property Name, ResourceName
    }else{
        Write-Host "-- empty--`n"
    } 
    Write-Host "`n`n------------------------------------------------------" -ForegroundColor Yellow 
}

$lastValidation = Read-Host "Do you wish to delete ALL the resources previously listed? (YES/NO)"
if($lastValidation.ToLower().Equals("yes")){
    foreach ( $g in $allRG){
        Write-Host "Deleting " $g.ResourceGroupName 
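        # Note: the -WhatIf switch below makes this a dry run; remove it to actually delete the resource groups.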
        Remove-AzureRmResourceGroup -Name $g.ResourceGroupName -Force -WhatIf
    }
}else{
    Write-Host "Aborted. Nothing was deleted." -ForegroundColor Cyan
}




The code is also available on Github: https://github.com/FBoucher/AzurePowerTools


How to use Azure Function App to crush an SQL Database on a schedule

In a project, I needed to run a task every day to process some data in an Azure SQL Database. I thought an Azure Function App would be the perfect candidate for that because we can attach it to a schedule, and I will only get charged when it is running. In this post, I will create a function that will be executed every five minutes. It will read the information from an SQL table Person and write the stats into another SQL table Statistic.

Azure Setup


Let's start by creating the Azure Function App. From the Azure portal (portal.azure.com), click the "+" sign in the top left corner and in the search textbox type Function App. Fill out the creation form as usual. Note that it is a good idea to put your Function App in the same location as what it will be interacting with; in this case, an SQL Database. Once the Function App is created, it will be possible to create a new function. For that, you can start with an empty one or use one of the multiple templates available. For this post, the TimerTrigger-CSharp template was a perfect fit.

createAzureFunctionApp

On this page you can configure the schedule. Here, I set it to 0 */5 * * * * because the task will be running every five minutes. It uses a CRON expression with six fields (second, minute, hour, day, month, day of the week), so this one means "at second 0 of every fifth minute"; to learn more about it, you can (should) refer to the documentation. You can edit that value later by going to the Integrate tab of the function. Great, now that we have our function, we need to provide the SQL Database connection string. In this demo, I will use the App Settings, but it could also be saved in Azure Key Vault. Remember, Function Apps are part of the Azure App Service ecosystem, thereby they have App Settings. Accessing them is really simple.

GoToAppSettings

In the left panel menu, click on the Function app settings option, then among the many choices look for Configure app settings. That will open the usual app settings blade that we use with other Web Apps. Scroll down until you've reached the Connection strings section. Add your connection string to your server, using the name expected by the code below (sqlConn)... And don't forget to save!

saveConnStr

Database Setup


For this demo, I will use two simple tables.
CREATE TABLE [dbo].[Person] (
    PersonID     INT NOT NULL IDENTITY(1,1) PRIMARY key,
    Firstname    VARCHAR(50)  NOT NULL, 
    Lastname     VARCHAR(50)  NOT NULL,
    Age          INT  NOT NULL
)
I will manually insert records, adding people with a random age. The Function App, on its side, will read the information from the table Person and calculate the average age and the number of people in the population, to finally insert the result into the table Statistic.
CREATE TABLE [dbo].[Statistic] (
    StatisticID    INT NOT NULL IDENTITY(1,1) PRIMARY key,
    Population     INT NOT NULL, 
    AverageAge     INT  NOT NULL,
    DateTaken      VARCHAR(50)  NOT NULL
)

Coding the Function App


The code of this function is not styled or optimized. It was kept very simple so it would be obvious what it was doing. Here is the code.
#r "System.Data"
using System;
using System.Configuration;
using System.Data.SqlClient;
public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");  
    var str = ConfigurationManager.ConnectionStrings["sqlConn"].ConnectionString;
    using (SqlConnection conn = new SqlConnection(str))
    {
        var sum = 0;
        var cnt = 0;
        var avr = 0;
        conn.Open();
        var sqlStr = "SELECT Age FROM [dbo].[Person]";
        using (SqlCommand cmd = new SqlCommand(sqlStr, conn))
        {
            var dataReader = await cmd.ExecuteReaderAsync();
            while(dataReader.Read()){
                sum += dataReader.GetInt32(0);
                cnt++;
            }
            avr = (sum / cnt);
            log.Info($"The average age of the population is {avr}.");
        }
        var sqlInsert = $"INSERT INTO [dbo].[Statistic](Population, AverageAge, DateTaken) VALUES ( {cnt} , {avr} , '{DateTime.Now}' )";
        using(SqlCommand cmd  = new SqlCommand(sqlInsert, conn)){
            var rows = cmd.ExecuteNonQuery();
            log.Info($"{rows} rows were inserted");
        }
    }  
}
The first line adds a reference to System.Data. Many libraries are already available to any Function App and just require a reference. When you need an external library, you will need to create a project.json file and add it to the dependencies; those will be loaded via NuGet. From there, the code speaks for itself: it begins by getting the connection string from the config, gets all the people and does its magic, then finally inserts the result into the Statistic table.

Let's Run It


Everything is now in place. It's time to use your favorite SQL tool to insert some records into the Person table. Personally, I created a little console application in .NET Core to do that; I share it at the end of this post. Once it's done, you will see the table Statistic growing every five minutes. Voila! Azure Function Apps are very useful and effective in this scenario. The code, of course, can be part of your repository, and now that the Azure Function Tools have just been released, you can even code them from Visual Studio!
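If you prefer to peek at the results from PowerShell instead of a SQL tool, a small sketch like this one works; it assumes the SqlServer PowerShell module is installed and uses placeholder credentials:

# Show the latest statistics rows computed by the function
Invoke-Sqlcmd -ServerInstance "YOURSERVERNAME.database.windows.net" -Database "YOURDATABASENAME" `
    -Username "YOURUSER" -Password "YOURPASSWORD" `
    -Query "SELECT TOP 5 * FROM [dbo].[Statistic] ORDER BY StatisticID DESC"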


Little Bonus


For a while now, every time I need to generate data, I tend to use a nice framework that my MVP buddies the ASP.NET Monsters have built, called GenFu. Available on GitHub, GenFu is a library that generates realistic test data. Here is the code I used to generate the population in this demo.
using System;
using System.Data.SqlClient;
using GenFu;
namespace ConsoleApplication
{
    public class Program
    {
        public static void Main(string[] args)
        {
            int counter = (args.Length > 0) ? Convert.ToInt32(args[0]) : 5;
            var Users = A.ListOf<Person>(counter);
            var connStr = "Server=YOURSERVERNAME.database.windows.net,1433;Initial Catalog=YOURDATABASENAME;Persist Security Info=False;User ID=YOURUSER;Password=YOURPASSWORD;MultipleActiveResultSets=True;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;";
            using (SqlConnection conn = new SqlConnection(connStr))
            {
                conn.Open();
                foreach (var u in Users)
                {
                    var sqlStr = $"INSERT INTO [dbo].[Person] (Firstname,Lastname,Age) VALUES ( '{u.Firstname}' , '{u.Lastname}' , {u.Age} )";
                    using (SqlCommand cmd = new SqlCommand(sqlStr, conn))
                    {
                        var rows = cmd.ExecuteNonQuery();
                        Console.WriteLine($"{rows} rows were inserted");
                    }
                }
                conn.Close();
            }
        }
    }
    public class Person
    {
        public string Firstname { get; set; }
        public string Lastname { get; set; }
        public int Age { get; set; }
    }
}
You will need to add the GenFu reference to the project.json file.
{
"version": "1.0.0-*",
"buildOptions": {
    "debugType": "portable",
    "emitEntryPoint": true
},
"dependencies": {
        "Genfu": "1.2.1",
        "System.Data.SqlClient": "4.3.0"
},
"frameworks": {
    "netcoreapp1.0": {
    "dependencies": {
        "Microsoft.NETCore.App": {
        "type": "platform", 
        "version": "1.0.1"
        }
    },
    "imports": "dnxcore50"
    }
},
"runtimes": {
"win-x64":{}
}
}





Happy coding!




Lessons learn while trying to deploy a Docker container in Azure

(This post is also available in French.)


Since I saw Donovan Brown do his demo during the Connect(); // 2016 keynote, I really wanted to have a taste of that left-click that turns a project into Docker containers. So last Sunday, I woke up early and, armed with a big bowl of café latte, started prepping my VM to try it. But as you can imagine, it didn't work on the first attempt, and that's what I want to share with you.

DonovanBrown

Let's get started


I built a new VirtualBox virtual machine (VM) using a Windows 10 ISO, then installed all the updates to get the Anniversary edition. I afterward went to visualstudio.com to get Visual Studio 2017 RC and installed it with a few components: web, Azure, etc. Next, I went looking for Docker for Windows. Same here, super easy: just download and run the installer from docker.com and voila... or not. As specified in the Docker documentation, the installer noticed that Hyper-V was not present in my environment and suggested installing it and rebooting the machine. Until then, everything was going fine, but when Docker tried to start after the reboot, I got an error message:

Error creating machine: Error in driver during machine creation: This computer doesn't have VT-X/AMD-v enabled. Enabling it in the BIOS is mandatory

However, the setting was enabled, as you can see in the screenshot.

VT-xSetting

After a short investigation, I found that VirtualBox did not support nested virtualization yet. Well, let's try another virtualization platform then.

First Success


I created a new VM under VMware Player this time and repeated all the steps as before. After the reboot, the little white whale in the system tray didn't prompt any error... Was it working? Let's try to create an Nginx hello-world container to check. And the answer was YES!

2016-11-22_19-00-45
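If you want to reproduce that quick check, a one-liner like this should do it; the container name and local port are arbitrary choices:

docker run --name hello-nginx -d -p 8080:80 nginx
# Then browse to http://localhost:8080 to see the Nginx welcome page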

Docker and Azure


Now let's get serious. It's time to do the left click on the project and add Docker Project Support. Then press F5 to try it locally.

ERROR: for mystuff Cannot create container for service mystuff: C: drive is not shared. Please share it in Docker for Windows Settings Encountered errors while bringing up the project..

I got too excited, and I had skipped some reading... It's written in black and white in the documentation that you need to share a drive to make it work.

ShareDrice

After that, it went smoothly, and in a few minutes I got my website running in a Docker container accessible via localhost:32768. Great! Now to get it available in Azure, I need to create a registry where all the images will be saved. To do that, simply left-click again on the project and select Publish. The guided dialog will pop up and help you create and deploy your things. An interesting point: before clicking the "Create" button, you will have the option to export your template as a JSON file, which is very useful. But for now, click Create, then Publish.

DockerPublishing

After a moment, a new browser window should pop up with your app now deployed in Azure.

DockerOnline

Wow! That was really a great experience. It's definitely a very simple process to get started. I really appreciate the Docker for Windows and Kitematic user interfaces. Of course, all the command lines are still available, but now I also have another option for the days when I feel more like clicking than typing.

DockerTools

Let's build it and ship it, from any platform to the cloud!