Styling Social Media Icon Lists in CSS (Mark Heath) - Yeah right, we can read CSS and probably hack something together... But it's excellent to learn how to do simple things the right way, and this post shows exactly that.
Power BI Desktop January Feature Summary (Amanda Cofsky) - It looks like some teams did not take much time off during the Holidays... So many great new functionalities; it's a fantastic way to start the year.
I recently started a French YouTube channel. Quickly, I got a message asking me to add English subtitles, and I also got a suggestion to leverage Azure Logic Apps and some Cognitive Services to help me with that task. I really liked the idea, so I gave it a shot. I recorded myself, and in twenty minutes I was done. Even though it was not the success I was hoping for, the application works perfectly. It's just that speaking in French with a lot of English technical words was a little bit too hard for the Video Indexer. However, if you are speaking only one language in your video, that solution will work perfectly. In this post, I will show you how to create that Logic App with Azure Video Indexer and Cognitive Services.
The Idea
Once a video is dropped in a OneDrive folder (or any file system accessible from Azure), a Logic App gets triggered; it uploads the file to the Azure Video Indexer, generates a Video Text Tracks (VTT) file, and saves this new file in another folder. A second Logic App then gets started; it uses the Translator Text API from Azure Cognitive Services to translate the VTT file and saves it into the final folder.
The Generation
Before getting started, you will need to create your Video Indexer API. To do this, log in to the Video Indexer developer portal and subscribe to the Video Indexer APIs - Production in the Products tab. You should then get your API keys.
For more details on the subscription, refer to the documentation. To see the names, parameters, and code samples for all the methods available in your new API, click on the APIs tab.
Now let's create our first Logic App. I always prefer to start with a blank template, but use whatever fits you best. Any online file system's trigger will do; in this case I'm using When a file is created from OneDrive. I had some issues with this trigger: it was not always fired by a new file. I tried the When a file is modified trigger, but it didn't solve the problem. If you think you know what I was doing wrong, feel free to leave a comment :).
The first real action is to upload the file to the Azure Video Indexer. We can do that very easily by using the method Upload video and index, passing the name and content from the trigger.
Of course, the longer the video, the longer the process will be, so we will need to wait. A way to do that is by adding a waiting loop. We will use the method Get processing state from the Video Indexer and loop until the status is processed. To slow down the loop, just add a wait action and set it to three or five minutes.
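Behind the designer, that waiting loop is an Until action in the workflow definition. As a rough sketch only (the action and loop names here are assumptions, the Get processing state connector action is omitted for brevity, and the exact state value may differ in your Code view), it could look like this:

```json
"Until_video_is_processed": {
    "type": "Until",
    "expression": "@equals(body('Get_processing_state'), 'Processed')",
    "limit": {
        "count": 100,
        "timeout": "PT10H"
    },
    "actions": {
        "Delay": {
            "type": "Wait",
            "inputs": {
                "interval": {
                    "count": 5,
                    "unit": "Minute"
                }
            }
        }
    }
}
```

The limit section is the safety net: the loop stops after the count or the timeout is reached, whichever comes first.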
When the file is completely processed, it is time to retrieve the VTT file. This is done in two simple steps. First, we get the URL by calling the method Get the transcript URL; then, with a simple HTTP GET, we download the file. The last thing we need to do is save it in a folder where our second Logic App will be watching for new drops.
In the visual designer, the Logic App should look like this.
The Translation
The second Logic App is very short. Once again, it gets triggered by a new file trigger on our OneDrive folder. Then it is time to call our Translator Text API from Azure Cognitive Services. Thanks to the great Logic App interface, it's very intuitive to fill in all the parameters for our call. Once we get the translation, we need to save it into our final destination.
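The connector hides the actual call, but under the hood it ends up hitting the Translator Text service. For context only, a raw equivalent against the v3 REST endpoint (the endpoint version, key header, and from/to languages here are assumptions; the connector available at the time of writing may use an older version) looks roughly like this:

```http
POST https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&from=fr&to=en
Ocp-Apim-Subscription-Key: <your Translator Text key>
Content-Type: application/json

[ { "Text": "contents of the VTT file" } ]
```

The response contains the translated text, which the Logic App then writes to the destination folder.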
The Logic App should look like this.
Conclusion
It was much easier than I expected. I really like implementing those integration projects with Logic Apps. It's so easy to "plug" all those APIs together with this interface. And yes, like I mentioned in the introduction, the result was not "great". I ran tests with videos purely in English (even with my accent) or purely in French (no mix), and the results were really good. So I think the problem is really the fact that I mix French and English. I could improve the Indexer by spending time providing files so the service could better understand my "Franglish". However, for twenty minutes of effort, I'm really impressed by the way it turned out. If you have ideas on how to improve this solution, or if you have some questions, feel free to leave a comment. You can also watch my French YouTube video.
Take a Break with Azure Functions (Justin Clareburt (MSFT)) - I know the Holidays are past, but it's always a good time to learn Azure Functions. Do yourself a favor the next rainy or super cold weekend... Follow this "program".
What part of your job can you automate? (Kevin Bah) - With time, all developers accumulate "tools", and these days, with all those scripts and API capabilities... It's not a question of how we can do something, but more: where to do it...
How to handle BLANK in DAX measures (Marco Russo) - This great post helps us think about whether blanks could be part of our data and how to manage them.
Is jQuery still relevant? (Remy Sharp) - I really appreciated this post, which brings numbers to support its answer.
Lately, I've been having some trouble when deploying from Visual Studio. At first, I didn't care, since I didn't have time to investigate and most of the time I was using PowerShell or the Azure CLI anyway. However, this issue was not usual for Visual Studio, so I decided to see what the problem was and try to fix it.
The Problem
In a solution, I added a simple Azure Resource Group deployment project just like this one.
Then, when I tried to right-click and do a Deploy...
I was getting this error message:
The following parameter values will be used for this operation:
Build started.
Project "TestARMProject.deployproj" (StageArtifacts target(s)):
Project "TestARMProject.deployproj" (ContentFilesProjectOutputGroup target(s)):
Done building project "TestARMProject.deployproj".
Done building project "TestARMProject.deployproj".
Build succeeded.
Launching PowerShell script with the following command:
'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\Deploy-AzureResourceGroup.ps1' -StorageAccountName '' -ResourceGroupName 'TestARMProject' -ResourceGroupLocation 'eastus' -TemplateFile 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.json' -TemplateParametersFile 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.parameters.json' -ArtifactStagingDirectory '.' -DSCSourceFolder '.\DSC'

Account : Frank Boucher
SubscriptionName : My Subscription
SubscriptionId : xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
TenantId : xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Environment : AzureCloud

VERBOSE: Performing the operation "Replacing resource group ..." on target "".
VERBOSE: 7:06:33 - Created resource group 'TestARMProject' in location 'eastus'

ResourceGroupName : TestARMProject
Location : eastus
ProvisioningState : Succeeded
Tags :
TagsTable :
ResourceId : /subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/TestARMProject

Get-ChildItem : Cannot find path
'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.json' because it does not
exist.
At D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\Deploy-AzureResourceGroup.ps1:108
char:48
+ ... RmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseNa ...
+                               ~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : ObjectNotFound: (D:\Dev\local\Te...zuredeploy.json:String) [Get-ChildItem], ItemNotFound
Exception
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.GetChildItemCommand

Deploying template using PowerShell script failed.
Tell us about your experience at https://go.microsoft.com/fwlink/?LinkId=691202
Apparently the script is failing in Get-ChildItem because my template file is missing?! I looked in the folder D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject, and indeed the files were missing! Fortunately, fixing this is really simple.
The Solution
The problem is very simple: when Visual Studio builds the project, it doesn't copy the script and template files into the build folder (in this case bin\Debug\staging\). In fact, Visual Studio is doing exactly what we are telling it to do. Let's see the build action for those files. Select azuredeploy.json, then right-click and select Properties (or press Alt+Enter).
See how the Build Action is set to None? Change it to Content (for all the scripts and templates), then save and deploy again.
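For the curious, that property change simply moves the files from a None item group to a Content item group inside the .deployproj file; it should end up looking something like this (a sketch, your exact file list may differ):

```xml
<ItemGroup>
  <Content Include="Deploy-AzureResourceGroup.ps1" />
  <Content Include="azuredeploy.json" />
  <Content Include="azuredeploy.parameters.json" />
</ItemGroup>
```

Content items are copied to the staging folder during the build, which is why the PowerShell script can then find them.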
Can I Have My CPU Back Visual Studio? (Rion Williams) - This is an interesting and easy solution to an annoying problem. Hopefully, a fix is coming soon.
I've been blogging with you for about ten years on this blog, and it's been a pleasure. More recently, I quietly started a French blog named Cloud en Français; feel free to have a look.
Today, I'm super excited to share with you my new project: A YouTube channel with original content published every week... in French! Every Wednesday I will be publishing a five-minute video to answer a problem or a question.
Come see me online, subscribe, and ask your questions...
Azure Application Architecture Guide (Mike Wasson) - A free book (pdf only) with all the best of the AzureCAT team? You really don't want to miss that opportunity.
Understanding Azure Event Grid (Jason Roberts) - Nice little post that introduces Event Grid, differentiates it from Service Bus, and quickly goes over the pricing.
Cloud en 5 minutes (Frank Boucher) - If French is your primary language, I just started a YouTube channel where I will publish a new video every Wednesday.
The other day, a friend asked me how he could add some functionality to an existing application without having access to the code. It was the perfect case to demo some Azure Functions capabilities, so I jumped on the occasion. Because my friend is a Node.js developer on Linux, and I knew it was supported, I decided to try that combination. I know Node, but I'm definitely not an expert since I don't practice very often.
This post is my journey building that demo. I was out of my comfort zone, coding in Node and working on a Linux machine, but not that far... because these days, you can "do some Azure" from anywhere.
The Goal
Coding an Azure Function that connects to a SQL Database (it could be any data source), using Node.js and tools available on Ubuntu.
Note: In this post, I will be using Visual Studio Code, but you could also create your function directly in the Azure Portal or from Visual Studio.
Getting Started
If you are a regular reader of this blog, you know how much I like Visual Studio Code. It's a great tool, available on Mac, Linux, and Windows, that lets you enjoy all its features from anywhere, feeling like you were in your cozy and familiar environment. If VSCode is not already installed on your machine, go grab your free copy at http://code.visualstudio.com.
Many extensions are available for VSCode, and one gives us the capability to code and deploy Azure Functions. To install it, open VSCode, select the extension icon, and search for Azure Functions; it's the one with the yellow lightning bolt and the blue angle brackets.
Create the Azure Function
To get started, let's create an Azure Functions project. Be sure to be in the folder where you wish to create your Function App. Open the Command Palette (Ctrl + Shift + P) and type Azure Functions. Select Azure Functions: Create New Project. That will add some configuration files for the Function App.
Now let's create a Function. You could open the Command Palette again and search for Azure Functions: Create Function, but let's use the UI this time. At the bottom left of the Explorer section, you should see a new section called AZURE FUNCTIONS. Click on the little lightning bolt to Create a new Function.
After you specify the Function App name, the Azure subscription, and other little essentials, a new folder will be added to your folder structure, and the function is created. The code of our function is in the file index.js. At the moment of writing this post, only JavaScript is supported by the VSCode extension.
Open the file index.js and replace all its content by the following code.
var Connection = require('tedious').Connection;
var Request = require('tedious').Request;
var TYPES = require('tedious').TYPES;

module.exports = function (context, myTimer) {
    var _currentData = {};
    var config = {
        userName: 'frankadmin',
        password: 'MyPassw0rd!',
        server: 'clouden5srv.database.windows.net',
        options: {encrypt: true, database: 'clouden5db'}
    };

    var connection = new Connection(config);
    connection.on('connect', function(err) {
        context.log("Connected");
        getPerformance();
    });

    // Read the best and average 5 km times from the RunnerPerformance table.
    function getPerformance() {
        var request = new Request("SELECT 'Best' = MIN(FivekmTime), 'Average' = AVG(FivekmTime) FROM RunnerPerformance;", function(err) {
            if (err) {
                context.log(err);
            }
        });
        request.on('row', function(columns) {
            _currentData.Best = columns[0].value;
            _currentData.Average = columns[1].value;
            context.log(_currentData);
        });
        request.on('requestCompleted', function () {
            saveStatistic();
        });
        connection.execSql(request);
    }

    // Persist the computed statistics, then tell the runtime we are done;
    // calling context.done() here (and not at the bottom of the function)
    // ensures the async database work finishes before the function ends.
    function saveStatistic() {
        var request = new Request("UPDATE Statistic SET BestTime=@best, AverageTime=@average;", function(err) {
            if (err) {
                context.log(err);
            }
        });
        request.addParameter('best', TYPES.Int, _currentData.Best);
        request.addParameter('average', TYPES.Int, _currentData.Average);
        request.on('requestCompleted', function () {
            context.log("Statistic Updated.");
            context.done();
        });
        connection.execSql(request);
    }
};
This code is just to demonstrate how to connect to a SQL Database and does not represent best practices. At the top, we have some declarations that use the package tedious; I will get back to that later. After that, I create a connection using the configuration declared just before. Then we hook some functions to some events. On the connection's connect event, the function getPerformance() is called to fetch the data.
On the request's row event, we grab the data and do the "math"; then finally, on requestCompleted, we call the second sub-function, which updates the database with the new values. For more information and more examples about tedious, check the GitHub repository.
Publish to Azure
All the code is ready; it's now time to publish our function to Azure. Once more, you could do that via the Command Palette or the extension menu. Use the method of your choice and select Deploy to Function App. After only a few seconds, our Function will be deployed to Azure.
Navigate to portal.azure.com and get to your Function App. If you try to Run the Function right now, you will get an error because tedious is not recognized.
Install the dependencies
We need to install the dependencies for the Function App, in this case tedious. A very simple way is to create a package.json file and use the Kudu console to install it. Create a package.json file with the following JSON in it:
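A minimal version of that file only needs to declare the tedious dependency; the package name and version numbers below are assumptions, so adjust them to your project:

```json
{
  "name": "clouden5minutes",
  "version": "1.0.0",
  "dependencies": {
    "tedious": "^2.1.1"
  }
}
```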
Open the Kudu interface. You can reach it by clicking on the Function App, then the Platform features tab, and finally Advanced tools (Kudu). Kudu is also available directly at the URL [FunctionAppName].scm.azurewebsites.net (ex: https://clouden5minutes.scm.azurewebsites.net). Select Debug console > CMD. Then, in the top section, navigate to the folder home\site\wwwroot and drag & drop the package.json file. Once the file is uploaded, type the command npm install to download and install all the dependencies declared in our file. Once it's all done, you should restart the Function App.
Wrapping up & my thoughts
There it is: if you go back to your Function now and try to execute it, it will work perfectly. It's true that I'm familiar with Azure Functions and SQL Database. However, for a first experience with Ubuntu and Node.js in the mix, I was expecting more resistance. One more time, VSCode was really useful, and everything was done with ease.
For those of you who would like to test this exact function, here is the SQL code to generate what is required on the database side.
CREATE TABLE RunnerPerformance(
Id INT IDENTITY(1,1) PRIMARY KEY,
FivekmTime INT
);
CREATE TABLE Statistic(
Id INT IDENTITY(1,1) PRIMARY KEY,
BestTime INT,
AverageTime INT
);
INSERT Statistic (BestTime, AverageTime) VALUES (1, 1);
DECLARE @cnt INT = 0;
WHILE @cnt < 10
BEGIN
INSERT INTO RunnerPerformance (FivekmTime)
SELECT 9+FLOOR((50-9+1)*RAND(CONVERT(VARBINARY,NEWID())));
SET @cnt = @cnt + 1;
END;
Azure SQL Databases Disaster Recovery 101 (Xiaochen Wu) - One of the best posts I've read on the topic. I'm not sure if I can sell the alien part to a client, but other than that, very clear. lol
Improvements to Azure Functions in Visual Studio (Justin Clareburt) - Wow. Azure Functions may evolve quickly, but the tools are keeping up, and this is impressive. Check out all the features that will be included in the next version (currently available in preview).
Creating a Minimal ASP.NET Core Windows Container (Jeffrey T. Fritz) - This is an amazing post that shows how to optimize our containers. It may be about ASP.NET Core, but it applies to any container. It also shows one of the big improvements in .NET Core...
Why I won't be switching to VSCode any time soon - A geek with a hat (Swizec Teller) - I was surprised when I saw the title of this post; how could you not like VSCode? However, after reading it, I understand. Though preferences are personal, VSCode does a lot and needs a bit more than a few minutes to be tamed. I also got overwhelmed when I started using it. The secret is the documentation... and some time. Then it becomes whatever you want.
Writing tests in Postman (joyce) - With all the connected things and all the APIs in our systems, this post shows a brilliant and simple way to test all those external calls.
How we set up daily Azure spending alerts and saved $10k (Mike Larah) - Cloudyn is definitely a must in many cases. I need to see how it manages CSP subscriptions and limited access for clients. This post also shared how to integrate it in Slack, pretty cool.
Azure Event Grid vs Azure IoT Hub: Which is better for IoT? (Chris Pietschmann) - This great post explains both services and then their differences. Both can do the job most of the time, but when it becomes serious, which one is best for your scenario?
Azure Virtual Machines Anatomy (Vincent-Philippe Lauzon) - This post is an excellent dissection of a virtual machine; it explains every part, one by one.
User accounts made easy with Azure (Andrew B Hall) - Excellent post showing how easy it is to integrate user management in our apps. I wish I had had that when I started my last portal...
My First Year as an MVP, part 1 (Jen Kuntz) - Interesting post. It feels soooooo familiar, and yet so far now. I look forward to meeting you in March fellow Canadian MVP.
7 Hidden Gems in Visual Studio 2017 (Visual Studio Team) - This post shares some unusual tips. Seriously, how many did you already know before reading the article?
Social Presentations with Mark Rendle (Carl and Richard) - That looks like a very promising tool... except for the name, which I continually forget... (but that's probably just me). I will definitely have a look at the GitHub repo.
What is a Cloud Developer Advocate? (Jeremy Likness) - Great article. A few of my friends joined that team, and even though I already had this talk with them, now I have a better picture.
Since its availability, I have tried to use Power BI as often as I can. It is so convenient to build visuals to explain data coming from a ton of possible data sources, and it's a breeze to share them with clients or colleagues. However, this post is not an infomercial about Power BI; it's about sharing some challenges I had while preparing a report and how I fixed them.
The Data Source
The context is simple: all transactions are in one table, and I have a second table with a little information related to clients. To that, I personally like to add a calendar table, because it simplifies my life.
For this report, it is very important to put a slicer by Client.
The Goal
I needed one report that shows, for every customer, three different Year To Date (YTD) totals: the classic YTD; an anniversary YTD, where the beginning of the year is, in fact, when the client started their enrolment; and a rolling twelve months.
It looks pretty simple, and in fact, it's not that complicated. Let's examine each total's formula one by one.
Classic Year To Date Total
Before we get started, it's a good practice to reuse measures to simplify our formulas and make expressions explicit. Let's create a measure for the total sales that will be used inside our other formulas.
TotalSales = SUM('Sales'[Total])
Now the Year To Date is simple to add, by creating a New Measure and entering the formula:
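Assuming the same TotalSales measure and Calendar table defined above, the classic measure is a one-liner:

```
TotalYTD = TOTALYTD([TotalSales], 'Calendar'[Date])
```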
If you activate the preview features of Power BI, it can be even easier. Look for the New Quick Measure button and select Total Year To Date, fill out the form, and voilà!
The generated formula looks a bit different because Power BI handles the error cases in the expression.
TotalYTD =
IF(
    ISFILTERED('Calendar'[Date]),
    ERROR("Time intelligence quick measures can only be grouped or filtered by the Power BI-provided date hierarchy."),
    TOTALYTD([TotalSales], 'Calendar'[Date].[Date])
)
Anniversary Year To Date Total
I spent more time than I was expecting on that one, because the online DAX documentation says that the formula TOTALYTD accepts a third parameter to specify the end of the year. So if I had only one client, with a fixed anniversary date (or a fiscal year), this formula would work, assuming the special date is April 30th.
TOTALYTD([TotalSales], 'Calendar'[Date], "04-30")
However, in my case, the ending date changes with every client. I'm sure right now you are thinking: "That's easy, Frank, just set a variable and that's it!" Well, it won't work. The thing is, the formula expects a static literal; no variable is allowed, even if it returns a string.
The workaround looks a bit hard at first, but it's not that complex. We need to write our own YTD formula. Let's look at the code, and I will explain it after.
Anniversary YTD =
VAR enddingDate = LASTDATE ( Company[EnrolmentDate] )
VAR enddingMonth = MONTH ( enddingDate )
VAR enddingDay = DAY ( enddingDate )
VAR currentDate = MAX ( Calendar[Date] )
VAR currentYear = YEAR ( currentDate )
VAR enddingThisYear = DATE ( currentYear, enddingMonth, enddingDay )
VAR enddingLastYear = DATE ( currentYear - 1, enddingMonth, enddingDay )
VAR enddingSelected = IF ( enddingThisYear < currentDate, enddingThisYear, enddingLastYear )
RETURN
CALCULATE (
    [TotalSales],
    DATESBETWEEN ( Calendar[Date], enddingSelected, currentDate )
)
The first lines are all variable declarations. They are not required, but I find it easier to understand when things are very explicit. Since I'm slicing my report by company, using LASTDATE is just a way to avoid errors; there should be only one record. Then we extract the year, month, and day.
The last variable, enddingSelected, identifies whether the anniversary (the end date) has already passed in the current calendar year. For example, with an April 30th anniversary, on February 15th this year's anniversary is still ahead, so last year's April 30th is selected as the starting point.
The CALCULATE function returns the TotalSales between the last anniversary date and today.
Rolling twelve Total
For the last formula, the rolling twelve months, we will reuse the previous code, but in a simpler way, since the period always ends on the current day.
Rolling 12 Total =
VAR todayDate = TODAY()
VAR todayMonth = MONTH ( todayDate )
VAR todayDay = DAY ( todayDate )
VAR todayYear = YEAR ( todayDate )
VAR enddingLastYear = DATE ( todayYear - 1, todayMonth, todayDay + 1 )
RETURN
CALCULATE (
    [TotalSales],
    DATESBETWEEN ( Calendar[Date], enddingLastYear, todayDate )
)
Wrap it up
I definitely learned a few things in that Power BI session, but it turned out to be pretty easy. Again, leave a comment or send me an email if you have any comments or questions; I will be very happy to hear from you.