First steps with Docker and Kubernetes - Introduction (Matteo Pagani) - Wow, a fantastic post to get started with Kubernetes. The author mentions that after reading it you won't be an expert... However, you will definitely know enough to be dangerous.
I published a video that explains how to unzip files without any code by using Logic Apps. However, that solution didn't work for bigger files or different archive types. This post explains how you can use an Azure Function to cover those situations. This first iteration supports "Zip" files; all the code is available on my GitHub.
Prerequisites
To create the Azure Function, I will use the excellent Azure Functions extension for Visual Studio Code. You don't "need" it; however, it makes things very easy.
You can easily install the extension from inside Visual Studio Code by clicking on the Extensions button in the left menu. You will also need to install the Azure Functions Core Tools.
Creating the Function
Once the extension is installed, you will find a new button in the left menu. It opens a new section with four new options: Create New Project, Create Function, Deploy to Function App, and Refresh.
Click on the first option, Create New Project. Select a local folder and a language; for this demo, I will use C#. This will create a few files and folders. Now let's create our function. From the extension menu, select the second option, Create Function. Create a Blob Trigger named UnzipThis in the folder we just created, and select (or create) the Resource Group, Storage Account, and location in your subscription. After a few seconds, another question will pop up asking for the name of the container that our blob trigger monitors. For this demo, input-files is used.
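If you prefer the command line, roughly the same scaffolding can be done with the Azure Functions Core Tools mentioned earlier. This is only a sketch: the project folder name is hypothetical, and the template name may vary slightly between Core Tools versions.

# Create a new C# Functions project (the folder name here is only an example)
func init AzUnzipEverything --dotnet
cd AzUnzipEverything

# Add a Blob trigger function named UnzipThis (template name may differ by Core Tools version)
func new --name UnzipThis --template "BlobTrigger"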
Once the function is created, you will see this warning message.
What that means is that, to be able to debug locally, we need to set AzureWebJobsStorage to UseDevelopmentStorage=true in the local.settings.json file. It will look like this.
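Here is a minimal sketch of that file; aside from AzureWebJobsStorage, the values below are placeholders for your own connection strings and the destination container name (the two destination settings are used later by the function):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "cloud5mins_storage": "<source storage account connection string>",
    "destinationStorage": "<destination storage account connection string>",
    "destinationContainer": "<destination container name>"
  }
}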
The binding is attached to the container named input-files, from the storage account reachable through the connection "cloud5mins_storage". The real connection string is in the local.settings.json file.
Now, let's put the code we need for our demo:
// Requires: System, System.IO, System.IO.Compression, System.Linq, System.Text.RegularExpressions,
// System.Threading.Tasks, Microsoft.Azure.WebJobs, Microsoft.Extensions.Logging,
// Microsoft.WindowsAzure.Storage, and Microsoft.WindowsAzure.Storage.Blob
[FunctionName("Unzipthis")]
public static async Task Run([BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]CloudBlockBlob myBlob, string name, ILogger log)
{
    log.LogInformation($"C# Blob trigger function processed blob\n Name: {name}");

    // The destination storage account and container are read from the application settings
    string destinationStorage = Environment.GetEnvironmentVariable("destinationStorage");
    string destinationContainer = Environment.GetEnvironmentVariable("destinationContainer");

    try
    {
        // Only .zip files are supported in this first iteration
        if (name.Split('.').Last().ToLower() == "zip")
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(destinationStorage);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference(destinationContainer);

            // Download the archive into memory, then open it with System.IO.Compression
            using (MemoryStream blobMemStream = new MemoryStream())
            {
                await myBlob.DownloadToStreamAsync(blobMemStream);

                using (ZipArchive archive = new ZipArchive(blobMemStream))
                {
                    foreach (ZipArchiveEntry entry in archive.Entries)
                    {
                        log.LogInformation($"Now processing {entry.FullName}");

                        // Replace any character that is not a letter, a digit, or a "-" with a "-";
                        // Azure Storage is strict about valid blob names
                        string validName = Regex.Replace(entry.Name, @"[^a-zA-Z0-9\-]", "-").ToLower();

                        // Upload the extracted entry to the destination container
                        CloudBlockBlob blockBlob = container.GetBlockBlobReference(validName);
                        using (var fileStream = entry.Open())
                        {
                            await blockBlob.UploadFromStreamAsync(fileStream);
                        }
                    }
                }
            }
        }
    }
    catch (Exception ex)
    {
        log.LogError($"Error! Something went wrong: {ex.Message}");
    }
}
UPDATED: Thanks to Stefano Tedeschi who found a bug and suggested a fix.
The source of our compressed file is defined in the trigger. The destination is defined by destinationStorage and destinationContainer, whose values are saved in local.settings.json. Then, because this function only supports .zip files, a little validation is required.
Next, we create an archive instance using the new System.IO.Compression library, and we create references to the destination storage account and container. It is not possible to use a second binding here, because a single archive file can contain a variable number of files to extract. Bindings are static; therefore, we need to use the regular storage API.
Then, for every file (aka entry) in the archive, the code uploads it to the destination storage.
Deploying
To deploy the function, click on the third option of the Azure Functions extension: Deploy to Function App. Then select your subscription and the Function App name.
Now we need to configure our settings in Azure. By default, the values from local.settings.json are NOT deployed. Once again, the extension is handy.
Under the subscription, expand the freshly deployed Function App AzUnzipEverything, then right-click on Application Settings. Use Add New Setting to create cloud5mins_storage, destinationStorage, and destinationContainer.
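If you prefer to script this step, the same application settings can also be set with the Azure CLI; the resource group name and all the values below are placeholders:

# Placeholders: replace the resource group and the setting values with your own
az functionapp config appsettings set \
    --name AzUnzipEverything \
    --resource-group <resource-group> \
    --settings "cloud5mins_storage=<source storage connection string>" \
               "destinationStorage=<destination storage connection string>" \
               "destinationContainer=<destination container name>"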
The function is now deployed and the settings are set. Now we only need to create the blob storage containers, and we will be able to test the function. You can easily do that directly from the Azure portal (portal.azure.com).
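If you prefer the command line over the portal, here is a rough Azure CLI equivalent; the account names and the destination container name are placeholders (you may also need to pass credentials, for example with --connection-string or --account-key):

# Source container monitored by the blob trigger
az storage container create --name input-files --account-name <source-storage-account>

# Destination container; its name must match the destinationContainer setting
az storage container create --name <destination-container> --account-name <destination-storage-account>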
You are now ready to upload a file into the input-files container.
Let's Code Together
This first iteration only supports "Zip" files. All the code is available on GitHub. Feel free to use it. If you would like to see or add support for other archive types, join me on GitHub!
In a video, please!
I also have a video version of this post if you prefer.
I also have an extended version where I introduce the Visual Studio Code extension for Azure Functions in more depth and explain more details about Azure Functions V2.
Migrating Azure Functions from v1 (.NET) to v2 (Jeremy Likness) - A great story and a migration walkthrough. By following these steps, our own migrations are most likely to be successful.
Applied AI in Software Development (Afsana Atar) - This post is an excellent ten-thousand-foot overview of what AI is. Perfect if you are not a data scientist and would like to learn more about that trend.
My Twitch Live Coding Setup (Suz Hinton) - A great post from Suz (aka Noopkat), a long-time technical streamer who is both very smart and generous. Definitely a post to read.
Sixty-five videos and one year later
It has already been one year since I started sharing videos on YouTube. I had been blogging for many years and wanted to try something new, and creating videos seemed like the next logical step. That is how I decided to start sharing short videos on YouTube answering technical questions about Microsoft Azure. In this post, I will explain what I learned during the first year of my journey.
The Beginning
I started my YouTube channel in French. During the first two months, I published one video per week. I learned a lot through this period: how to prepare my code snippets, my files, and my screen. The biggest takeaway was that preparation is the key. The more prepared you are, the more efficient you will be in front of the camera. I also discovered that, compared to writing a blog post, you need to pay attention to the tiny details much earlier in the process.
It may sound like a cliché, but it was challenging to watch myself. However, it's the best way to improve. You need to accept your mistakes and promise yourself that next time you will do better. It's normal and okay not to be perfect.
Let's be bilingual
One thing I was looking forward to was being able to share my videos with all my clients, friends, and community members. However, I couldn't, because while some are perfectly comfortable in French (and only French), others only understand English. This is why I decided to record all my videos in both French and English. I knew that would mean doubling the workload, so I adapted my schedule to publish every second week.
Doing all those videos, I found my beat, my style, and the things I like and dislike. I also spent a lot of time learning the YouTube platform through books and by watching other YouTubers. It's such an incredibly resourceful community!
A New Camera
The software I was using, and that I'm still using today (Camtasia from TechSmith), can record both your screen and your webcam. It keeps all the tracks separate, which is great because you can change the size and position of the webcam inset in post-production.
My challenge was that I wanted to record both in 1080p (full HD), because during my intros and conclusions, when the webcam input fills the screen, I want good image quality. However, the software (or more likely my PC) did not let me record both the screen and the webcam input in 1080p.
To upgrade the quality of my recordings, I decided to use my DSLR instead of my webcam. Of course, it means more work in post-production because now I need to synchronize the two video sources, but now everything is in 1080p.
The famous algorithm
The more I studied my statistics and read the comments (thank you all for your constructive comments, by the way), the more I understood that having all my videos, in both French and English, on the same channel was not the best idea.
First, it was very confusing for the YouTube algorithm because it couldn't identify the videos' language, making it harder for it to do its job and make recommendations.
Secondly, it didn't cross my mind at the beginning because I understand both languages, but for a subscriber who understands only one of them, it was very confusing and not very pleasant to have all those "other", unusable videos in the feed.
After a very long (and hard) reflection, I decided to create a second channel and "move" all my French content to that channel. I say "move", but you cannot "move" content on YouTube, and that was the main reason I was hesitating to split my channel. By creating a new channel and re-uploading my older (aka French) videos, I was losing all my stats (views, subscribers) and references.
The effect on the English channel was immediate, as the views and subscribers exploded a week later. On the other side, my brand new French channel didn't really catch up. Nevertheless, I intend to continue creating French content. ;)
Today's Status
As this post gets published, I have a cumulative total of sixty-five videos online and more than a thousand subscribers. Every second week, usually on Thursday, I publish two videos on the same topic: one in French on fr.cloud5mins.com and one in English on cloud5mins.com. I have a fantastic, super supportive community asking questions and suggesting topics.
I'm very thankful for all that positive feedback. To help me, you can of course subscribe to my YouTube channel, or become one of my patrons at patreon.com/fboucheros.
What's Next
I'm definitely continuing to create videos. I plan to start streaming on Twitch, where I will be building cloud solutions.
Rust Governance: Scaling Empathy (Manish Goregaokar) - A friend recently introduced me to Rust. This post assumes we are all already familiar with it, and talks about the current problems of this new language and potential solutions.
I was definitely not expecting that when I picked up this book, but I am happy I did.
This "self-help" book is filled with a ton of comedy, and I appreciated it. I felt like my best friend was talking about a serious topic, but because he was in a crazy good mood, he was just having a good time telling me his story. Simple and real. It leaves you with a lot to think about.
Azure Automation of A-to-Z, Part I (dbakevlar) - This is a great, very instructive post that explains how and why we should structure our scripts.
Azure Policies (Gregor Suttie) - One of the best tools to help us with governance in Azure is policies. This post is a nice introduction to how they can help.
Moving your ASP.NET applications to the Microsoft Cloud (Premier Developer) - If you are thinking of migrating to the cloud, it's important to plan your migration. This post is the perfect place to get started; it contains references to more in-depth books and documents.
The Rise of Microsoft Visual Studio Code (Lyn Levenick) - Cool statistics about editor usage. Not sure of the real correlation between the editor used and the skill level, but it's still an interesting coincidence.
Stream Deck Tricks for Streamers… and Muggles too! (Jeff) - Fantastic post that explains so well why that little thing can save you so much pain. As THE day when I start streaming gets closer and closer... this is gold.