
Full-Stack Azure Deployment Made Easy: Containers, Databases, and More

Automating deployments is something I always enjoy. However, it's true that it often takes more time than a simple "right-click deploy." Plus, you may need to know different technologies and scripting languages.

(French version here)

But what if there was a tool that could help you write everything you need—Infrastructure as Code (IaC) files, scripts to copy files, and scripts to populate a database? In this post, we'll explore how the Azure Developer CLI (azd) can make deployments much easier.

What do we want to do?

Our goal: Deploy the 2D6 Dungeon App to Azure Container Apps.

This .NET Aspire solution includes:

  • A frontend
  • A data API
  • A database

Aspire resources schema


The Problem

In a previous post, we showed how azd up can easily deploy web apps to Azure.

If we try the same command for this solution, the deployment will be successful—but incomplete:

  • The .NET Blazor frontend is deployed perfectly.
  • However, the app fails when trying to access data.
  • Looking at the logs, we see the database wasn't created or populated, and the API container fails to start.

Let's look more closely at these issues.

The Database

When running the solution locally, Aspire creates a MySQL container and executes SQL scripts to create and populate the tables. This is specified in the AppHost project:

var mysql = builder.AddMySql("sqlsvr2d6")
                   .WithLifetime(ContainerLifetime.Persistent);
                
var db2d6 = mysql.AddDatabase("db2d6");

mysql.WithInitBindMount(source: "../../database/scripts", isReadOnly: false);

When the MySQL container starts, it executes any SQL files it finds in its init folder (/docker-entrypoint-initdb.d in the official image). Locally, this works because the bind mount maps that folder to a local directory containing the files.

However, when deployed to Azure:

  • The mounts are created in Azure Storage Files
  • The files are missing!

The Data API

This project uses Data API Builder (dab). Based on a single config file, a full data API is built and hosted in a container.

Locally, Aspire creates a DAB container and reads the JSON config file to create the API. This is specified in the AppHost project:

var dab = builder.AddDataAPIBuilder("dab", ["../../database/dab-config.json"])
                .WithReference(db2d6)
                .WaitFor(db2d6);

But once again, when deployed to Azure, the file is missing. The DAB container starts but fails to find the config file.

Logs of DAB failing to start


The Solution

The solution is simple: the SQL scripts and DAB config file need to be uploaded into Azure Storage Files during deployment.

You can do this by adding a post-provision hook in the azure.yaml file to execute a script that uploads the files. See an example of a post-provision hook in this post.
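
For reference, here is a minimal sketch of what such a hook could look like in azure.yaml (the script name and path are assumptions):

name: 2d6-dungeon-app
hooks:
  postprovision:                     # runs right after "azd provision" completes
    shell: sh
    run: ./scripts/upload-files.sh   # hypothetical script that uploads the SQL scripts and DAB config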

Alternatively, you can leverage azd alpha features: azd.operations and infraSynth.

  • azd.operations extends the provisioning providers and will upload the files for us.
  • infraSynth generates the IaC files for the entire solution.

💡Note: These features are in alpha and subject to change.

Each azd alpha feature can be turned on individually. To see all features:

azd config list-alpha

To activate the features we need:

azd config set alpha.azd.operations on
azd config set alpha.infraSynth on

Let's Try It

Once the azd.operations feature is activated, any azd up will upload the files into Azure. If you check the database, you'll see that the db2d6 database was created and populated. Yay!

However, the DAB API will still fail to start. Why? Because, currently, DAB looks for a file, not a folder, when it starts. This can be fixed by modifying the IaC files.

One Last Step: Synthesize the IaC Files

First, let's synthesize the IaC files. These Bicep files describe the required infrastructure for our solution.

With the infraSynth feature activated, run:

azd infra synth

You'll now see a new infra folder under the AppHost project, with YAML files matching the container names. Each file contains the details for creating a container.

Open the dab.tmpl.yaml file to see the DAB API configuration. Look for the volumeMounts section. To help DAB find its config file, add subPath: dab-config.json to make the binding more specific:

containers:
    - image: {{ .Image }}
      name: dab
      env:
        - name: AZURE_CLIENT_ID
          value: {{ .Env.MANAGED_IDENTITY_CLIENT_ID }}
        - name: ConnectionStrings__db2d6
          secretRef: connectionstrings--db2d6
      volumeMounts:
        - volumeName: dab-bm0
          mountPath: /App/dab-config.json
          subPath: dab-config.json
scale:
    minReplicas: 1
    maxReplicas: 1

You can also specify the scaling minimum and maximum number of replicas if you wish.

Now that the IaC files are created, azd will use them. If you run azd up again, the execution time will be much faster—azd deployment is incremental and only does "what changed."

The Final Result

The solution is now fully deployed:

  • The database is there with the data
  • The API works as expected
  • You can use your application!
2D6 Dungeon App deployed


Bonus: Deploying with CI/CD

Want to deploy with CI/CD? First, generate the GitHub Action (or Azure DevOps) workflow with:

azd pipeline config

Then, add a step to activate the alpha feature before the provisioning step in the azure-dev.yml file generated by the previous command.

- name: Extends provisioning providers with azd operations
  run: azd config set alpha.azd.operations on     
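
For context, in the generated azure-dev.yml this step sits just before the provisioning step; a hedged sketch (step names and flags may vary slightly between azd versions):

steps:
  # ... checkout, azd install, and login steps generated by azd pipeline config ...
  - name: Extends provisioning providers with azd operations
    run: azd config set alpha.azd.operations on

  - name: Provision Infrastructure
    run: azd provision --no-prompt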

With these changes, and assuming the infra files are included in the repo, the deployment will work on the first try.

Conclusion

It's exciting to see how tools like azd are shaping the future of development and deployment. Not only do they make the developer's life easier today by automating complex tasks, but they also ensure you're ready for production with all the necessary Infrastructure as Code (IaC) files in place. The journey from code to cloud has never been smoother!

If you have any questions or feedback, I'm always happy to help—just reach out on your favorite social media platform.

In Video

Here is the video version of this blog post.




Reading Notes #644

This post gathers my recent reading notes on artificial intelligence, programming, and a few inspiring podcasts. It includes links to articles, tutorials, and fascinating discussions. Whether you're interested in the latest AI developments, .NET tools, or modern architectures, there's plenty here to spark your curiosity. 


Happy reading!

AI


Programming


Podcasts

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank

Visual Countdown Days Until [a date]

During the holidays, I embarked on a fun project to create a visual countdown for important dates. Inspired by howmanysleeps and hometime from veebch, I wanted to build a countdown that didn't rely on Google Calendar. Instead, I used a Raspberry Pi Pico and some custom code to achieve this.

💾 You can find the full code on GitHub


Raspberry Pi Pico and the light strip using custom colors

What It Is

This project consists of two main parts:

  • Python code for the Raspberry Pi Pico
  • A .NET website to update the configuration, allowing you to set:
    • The important date
    • Two custom colors or random ones
    • The RGB values for the custom colors


screenshot of the configuration website

What You Need

How to Deploy the Configuration Website

After cloning the repo, navigate to the src/NextEvent/ folder and use the Azure Developer CLI to initialize the project:

azd init

Enter a meaningful name for your resource group in Azure. To deploy, use the deployment command:

azd up

Specify the Azure subscription and location when prompted. After a few minutes, everything should be deployed. You can access the URL from the output in the terminal or retrieve it from the Azure Portal.

How to Set Up the Raspberry Pi Pico

Edit the config.py file to add your Wi-Fi information and update the number of lights on your light strip.
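
As an illustration, the settings could look something like this (variable names here are assumptions; check config.py in the repo for the real ones):

# config.py - hypothetical example values
WIFI_SSID = "my-home-network"       # your Wi-Fi network name
WIFI_PASSWORD = "my-wifi-password"  # your Wi-Fi password
NUM_LEDS = 50                       # number of lights on your LED strip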

You can use Thonny to copy the Python code to the device. Copy both main.py and config.py to the Raspberry Pi Pico.

How It Works

  • The website creates a JSON file and saves it in a publicly accessible Azure storage.
  • When the Pi is powered on, it will:
    • Turn all the lights of the strip green, one by one
    • Change the color of the entire light strip a few times, then turn it off
    • Try to connect to the Wi-Fi
    • Retrieve the timezone, current date, and settings from the JSON file
    • If the important date is within 24 days, the countdown will be displayed using random colors or the specified colors.
    • If the date has passed, the light strip will display a breathing effect with a random color of the day.

The Code on the Raspberry Pi Pico

The main code for the Raspberry Pi Pico is written in Python. Here's a brief overview of what it does:

  1. Connect to Wi-Fi: The connect_to_wifi function connects the Raspberry Pi Pico to the specified Wi-Fi network.
  2. Get Timezone and Local Time: The get_timezone and get_local_time functions fetch the current timezone and local time using online APIs.
  3. Fetch Light Settings: The get_light_settings function retrieves the important date and RGB colors from the JSON file stored in Azure.
  4. Calculate Sleeps Until Special Day: The sleeps_until_special_day function calculates the number of days until the important date (sketched just after this list).
  5. Control the LED Strip: The progress function controls the LED strip, displaying the countdown or a breathing effect based on the current date and settings.
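
As an illustration of step 4, here is a hedged sketch of the day-count logic. The real code on GitHub runs on MicroPython and works with the time values fetched above; plain Python with datetime shows the idea:

# Hypothetical sketch, not the exact code from the repo
from datetime import date

def sleeps_until_special_day(today: date, special_day: date) -> int:
    # Number of "sleeps" (nights) left before the special day; never negative
    return max((special_day - today).days, 0)

print(sleeps_until_special_day(date(2024, 12, 1), date(2024, 12, 25)))  # 24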

The Configuration Website

The configuration website is built in C#. It's a Blazor Server web app, and I used .NET Aspire to make it easy to run locally. The UI uses FluentUI-Blazor, so it looks pretty without effort.

The website allows you to update the settings for the Raspberry Pi Pico. You can set the important date, choose custom colors, and save these settings to a JSON file in Azure storage.
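
The exact schema lives in the repo, but conceptually the saved settings file could look like this (field names are hypothetical):

{
  "importantDate": "2025-12-25",
  "useRandomColors": false,
  "color1": { "r": 255, "g": 0, "b": 0 },
  "color2": { "r": 0, "g": 128, "b": 0 }
}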

Little Extra

The website is deployed as an Azure Container App with the minimum scaling set to zero to save on costs. This may cause a slight delay when loading the site for the first time, but it works just fine and returns to "dormant" mode after a while.
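
In a Container App template, scale-to-zero comes down to the minimum replica count; a hedged sketch of the relevant fragment:

scale:
    minReplicas: 0    # allow the app to go "dormant" when idle
    maxReplicas: 1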

I hope you enjoyed reading about my holiday project! It was a fun and educational experience, and I look forward to working on more projects like this in the future.

What's Next?

Currently the project does a 24-day countdown (inspired by the advent calendar). I would like to add a feature that lets the user set the number of days for the countdown. I would also like to add the possibility to set the color of the breathing effect (or keep it random) once the important date has passed. And lastly, I would like to add the time of day when the light strip should turn on and off, because we all have different schedules 😉.

Last thoughts

I really enjoyed doing this project. It was a fun way to learn more about the Raspberry Pi Pico, MicroPython (I didn't even know it was a thing), and FluentUI Blazor. I hope you enjoyed reading about it and that it inspires you to create your own fun projects. If you have any questions or suggestions, feel free to reach out; I'm fboucheros on most socials.

~Frank

It turns out, it's not difficult to remove all passwords from our Docker Compose files

I used to hardcode my password in my demos and code samples. I know it's not a good practice, but it's just for demo purposes, it cannot be that dramatic, right? I know there are proper ways to manage sensitive information, but this is only temporary! And it must be complicated to remove all the passwords from a deployment... It turns out, IT IS NOT difficult at all, and that will prevent serious threats.

In this post, I will share how to remove all passwords from a docker-compose file using environment variables. It's quick to set up and easy to remember. For production deployments, it's better to use secrets, because environment variables are visible in logs; that said, for demos, debugging, and testing, it's nice to see those values. The code is available on GitHub. This deployment was used for my talks during Azure Developers .NET Days: Auto-Generate and Host Data API Builder on Azure Static Web Apps and The most minimal API code of all... none

The Before Picture

For this deployment, I used a docker-compose file to deploy SQL Server in a first container and Data API Builder (DAB) in a second one. When the database container starts, a script runs to create the database tables and populate them.

services:

  dab:
    image: "mcr.microsoft.com/azure-databases/data-api-builder:latest"
    container_name: trekapi
    restart: on-failure
    volumes:
      - "./startrek.json:/App/dab-config.json"
    ports:
      - "5000:5000"
    depends_on:
      - sqlDatabase

  sqlDatabase:
    image: mcr.microsoft.com/mssql/server
    container_name: trekdb
    hostname: sqltrek
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "1rootP@ssword"
    ports:
      - "1433:1433"
    volumes:
      - ./startrek.sql:/startrek.sql
    entrypoint:
      - /bin/bash
      - -c
      - |
        /opt/mssql/bin/sqlservr & sleep 30
        /opt/mssql-tools/bin/sqlcmd -U sa -P "1rootP@ssword" -d master -i /startrek.sql
        sleep infinity

As we can see, the password is in clear text twice: in the configuration of the database container and in the parameter for sqlcmd when populating the database. The same goes for the DAB configuration file. Here is the data-source node, where the password sits in clear text inside the connection string.

"data-source": {
 	"database-type": "mssql",
	"connection-string": "Server=localhost;Database=trek;User ID=sa;Password=myPassword!;",
	"options": {
		"set-session-context": false
	}
}

First Pass: Environment Variables

The easiest password instance to remove was the one in the sqlcmd command. When defining the container, an environment variable was already set... why not use it! To reference a container's environment variable inside a command in a docker-compose file, escape the dollar sign: $$VAR_NAME. The double $ keeps Docker Compose from interpolating the value at parse time and passes $VAR_NAME through to the container's shell. I used the existing MSSQL_SA_PASSWORD variable to replace the hardcoded password.

/opt/mssql-tools/bin/sqlcmd -U sa -P $$MSSQL_SA_PASSWORD -d master -i /startrek.sql

Second Pass: .env File

That's great, but the value is still hardcoded where we assign the environment variable. Here comes the environment file: a text file that holds values as key-value pairs. The file is not committed to the repository and is used to store sensitive information. Docker Compose reads the file and injects the values. Here is the final docker-compose file:

services:

  dab:
    image: "mcr.microsoft.com/azure-databases/data-api-builder:latest"
    container_name: trekapi
    restart: on-failure
    env_file:
      - .env
    environment:
      MY_CONN_STRING: "Server=host.docker.internal;Initial Catalog=trek;User ID=sa;Password=${SA_PWD};TrustServerCertificate=True"
    volumes:
      - "./startrek.json:/App/dab-config.json"
    ports:
      - "5000:5000"
    depends_on:
      - sqlDatabase

  sqlDatabase:
    image: mcr.microsoft.com/mssql/server
    container_name: trekdb
    hostname: sqltrek
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: ${SA_PWD}
    env_file:
      - .env
    ports:
      - "1433:1433"
    volumes:
      - ./startrek.sql:/startrek.sql
    entrypoint:
      - /bin/bash
      - -c
      - |
        /opt/mssql/bin/sqlservr & sleep 30
        /opt/mssql-tools/bin/sqlcmd -U sa -P $$MSSQL_SA_PASSWORD -d master -i /startrek.sql
        sleep infinity

Note the env_file directive in the service definitions; it names the file to read, .env. The ${SA_PWD} syntax tells Docker Compose to look up SA_PWD in that file. Here is what the file looks like:

SA_PWD=This!s@very$trongP@ssw0rd
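
On the DAB side, the config file can consume the MY_CONN_STRING variable with DAB's @env() function instead of a hardcoded connection string. A minimal sketch based on the data-source node shown earlier:

"data-source": {
	"database-type": "mssql",
	"connection-string": "@env('MY_CONN_STRING')",
	"options": {
		"set-session-context": false
	}
}

With that in place, the only copy of the password lives in the untracked .env file.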

Conclusion

Simple and quick. There is no reason to keep passwords in clear text in docker-compose files anymore, even for a quick demo! Of course, for a production deployment there are stronger ways to manage sensitive information, but for a demo this is perfect, and it's secure.

During the Microsoft Build keynote on day 2, Julia Liuson and John Lambert talked about how threat actors are not only going after the big fish, but are also scanning simple demos and old pieces of code, looking for passwords, keys, and sensitive information.

How to Deploy a .NET isolated Azure Function using Zip Deploy in One-Click

In this post, I will share a few things that need our attention when deploying a .NET isolated Azure Function from GitHub to Azure using the Zip Deploy method. This method is great for fast deployments and when your artefacts are zipped in a package.

Note: The complete code for this post is available on GitHub.


Understanding Zip Push/Zip Deploy

Zip Push allows us to deploy a compressed package, such as a zip file, directly to Azure. It could be part of a continuous integration and continuous deployment (CI-CD) pipeline or, as in this example, replace it. This approach is particularly useful when you want to ensure your artefacts remain unchanged across environments or when aiming for the fastest deployment experience for users.

While CI-CD is excellent for keeping your code up-to-date, zip deployment offers the advantage of speed and consistency. It eliminates the need for compilation, leading to quicker uploads and deployments.


Preparing Your Package

It's crucial to package the code with all necessary dependencies. Nothing fetches external packages during the deployment; the zip file is decompressed, and that's it. The best way to ensure you have everything you need is to publish your code to a folder, then go into that folder and zip all the files.

dotnet publish -c Release -o ./out

Don't zip the folder, it won't work as expected.

Don't zip the publish folder; it won't work

You need to go inside the folder, select all the files, and zip them to create your deployment artefact.

From inside the publish folder, zip all the files
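
Assuming the publish output from the earlier command landed in ./out, the packaging could look like this (the archive name is just an example):

cd out
zip -r ../ZipDeploy-package-v1.zip .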

The next step is to make your artefact available online. There are many ways to do this, but for this post we are using a GitHub Release. From the GitHub repository, create a new release, upload the zipped file created earlier, and publish it. Note the URL of the zipped file from the release.


Preparing The ARM Template

For this one-click deployment, we need an Azure Resource Manager (ARM) template: a document that describes the resources we want to deploy to Azure. To deploy the zipped file into the Azure Function, two particularities require our attention.

Here are the relevant sections of the template.

[...]
"resources": [
    {
        "apiVersion": "2022-03-01",
        "name": "[variables('funcAppName')]",
        "type": "Microsoft.Web/sites",
        "kind": "functionapp",
        "location": "[resourceGroup().location]",
        "properties": {
            "name": "[variables('funcAppName')]",
            "siteConfig": {
                "appSettings": [
                    {
                        "name": "FUNCTIONS_WORKER_RUNTIME",
                        "value": "dotnet-isolated"
                    },
                    {
                        "name": "WEBSITE_RUN_FROM_PACKAGE",
                        "value": "1"
                    },
                    [...]

Here we define a Windows Azure Function, and WEBSITE_RUN_FROM_PACKAGE needs to be set to 1. WEBSITE_RUN_FROM_PACKAGE is the setting that tells Azure to run the app directly from the deployed zip package.

Then to specify where the zip file is located we need to add an extension to the Azure Function.

    {
      "type": "Microsoft.Web/sites/extensions",
      "apiVersion": "2021-02-01",
      "name": "[format('{0}/ZipDeploy', variables('funcAppName'))]",
      "properties": {
        "packageUri": "https://github.com/FBoucher/ZipDeploy-AzFunc/releases/download/v1/ZipDeploy-package-v1.zip",
        "appOffline": true
      },
      "dependsOn": [
        "[concat('Microsoft.Web/sites/', variables('funcAppName'))]"
      ]
    }

The packageUri property is the URL of the zipped file from the GitHub release. Note the dependsOn property that ensures the Azure Function is created before the extension is added. The complete ARM template is available in the GitHub repository.


One-click Deployment

When you have your artefact and the ARM template uploaded to your GitHub repository, you can create a one-click deployment button. This button will take the user to the Azure portal and pre-fill the deployment form with the information from the ARM template. Here is an example of the button for markdown.

[![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FFBoucher%2FZipDeploy-AzFunc%2Fmain%2Fdeployment%2Fazuredeploy.json)

The markdown has three parts: the image displayed on the button, the link to the Azure portal, and the URL of the ARM template. The ARM template URL is the raw URL of the file in the GitHub repository, and it needs to be URL encoded. The encoding can be done with a tool like URL Encode/Decode.

Final Thoughts

Zip deployment is a powerful tool in your Azure arsenal, by itself or as part of a more complex CI-CD pipeline. It's a great way to make it easier for people to deploy your solution in their Azure subscription without having to clone or fork the repository.


Video version

If you prefer, there is also a video version of this post.


Reading Notes #589

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

  • Understanding C# 8 default interface methods (Andrew Lock) - Very clear post about the new feature available in interfaces, with great examples that make us understand why and when it is useful and how to implement it.

Open Source

Podcasts

~Frank

Reading Notes #574


It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

Podcasts

  • Going Full Time on Open Source with Shaun Walker (.NET Rocks!) - Older readers will remember the amazing DotNetNuke. That OSS project was created by Shaun years ago. In this episode, they talk about his new project and how he is building it.

Miscellaneous

~frank

Reading Notes #567

Programming

DevOps

  • How to deploy Azure Container Apps (Shawn Sesna) - This is a great tutorial to get your Container Apps deployed without having to care too much about infrastructure, a.k.a. Kubernetes.

Podcasts

Frank

Reading Notes #563


It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you think you may have interesting content, share it!

Cloud

Programming

Open Source

  • Build an Open Source Project: Behind the Scenes (Alexey Yuzhakov) - This post shares a story of a real-life open source project. It's about putting in the effort and doing the extra work to make our project more useful and accessible.

Low Code

Podcast

~Frank

Reading Notes #546


Cloud

Programming

Podcasts

Books


  • Learning Blazor (David Pine) - This book is just perfect! It explains a bit of everything. It is packed with real examples and code variations (because there are so many ways to write something). There was even a full chapter on testing with Playwright; I didn't expect that, and it was great!

Reading Notes #544


It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, podcast episodes, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Programming

Podcasts

~Frank


Reading Notes #541


Already time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, podcast episodes, and books that catch my interest during the week. 

If you think you may have interesting content, share it!

 

Cloud


Programming


Books



Author: Erica Dhawan

It's been a while since I had so many "aha" moments while reading a book. Digital Body Language is about communication using many different technologies, across different cultures, generations, and individuals...

It's a must if you care about how your messages are received.


Miscellaneous


~frank


Reading Notes #526

Great White Heron standing in the water

Good Monday, it's time to share new ReadingNotes. Here is a list of all the articles, podcasts, and blog posts, that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

Podcast

Miscellaneous


~Frank

Reading Notes #523

Good Monday, it's time to share new ReadingNotes. Here is a list of all the articles, podcasts, and blog posts, that catch my interest during the week.

If you think you may have interesting content, share it!

Suggestion of the week

Cloud

DevOps

Programming

Miscellaneous

~Frank



Reading Notes #517


Good Thursday, 
back from a nice time off, I'm all recharged and it's time to share new ReadingNotes. Here is a list of all the articles, podcasts, and blog posts, that catch my interest during the week. 

If you think you may have interesting content, share it!

Cloud


Programming


Podcasts


~frank


Reading Notes #513


Good Monday, it's time to share new readingnotes. Here is a list of all the articles, and blog posts, that catch my interest during the week. 

If you think you may have interesting content, share it!

The suggestion of the week

Cloud

Programming

Miscellaneous


~frank


Reading Notes #508


It's... Tuesday! 
Yes, I know, one day late, but it's still time to share my reading notes. Those are a curated list of all the articles, blog posts, podcast episodes, and books that catch my interest during the week and that I found interesting. It's a mix of current news and what I consumed.

If you think you may have interesting content, share it!

Cloud

Programming

Podcast

Miscellaneous


~frank


Reading Notes #503


It's Monday, time to share my reading notes. Those are a curated list of all the articles, blog posts, podcast episodes, and books that catch my interest during the week and that I found interesting. It's a mix of current news and what I consumed.

If you think you may have interesting content, share it!

Cloud

Programming


~Frank


Reading Notes #491


Good Monday, already time to share new reading notes. 
It is a habit I started a long time ago where I share a list of all the articles, blog posts, podcast episodes, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud


Programming


Databases


Miscellaneous


~frank