Welcome to another edition of my reading notes! This week brings some fascinating insights into AI's real-world impact, exciting developments in .NET and containerization, plus practical tools for improving our development workflows.
From local AI-powered code reviews to Docker security hardening and the upcoming .NET 10 features, there's plenty to explore.
AI
The promise that wasn’t kept (Salma Alam-Naylor) - Interesting thoughts about AI and its impact on our work, but also on our lives and the planet.
Inside GitHub: How we hardened our SAML implementation (Greg Ose, Taylor Reis) - Very interesting post that pulls back the curtain a little so we can see, behind the scenes, how this widely used authentication system works and how it has been updated.
Enhance productivity with AI + Remote Dev (Brigit Murtaugh, Christof Marti, Josh Spicer, Olivia Guzzardo McVicker) - I love dev container environments; they are so useful! I also use remote ones when I'm not on my dev device, it's so easy. Happy to see that Copilot will be right there with me.
It's time for another edition of Reading Notes! This week brings exciting developments in the open source world, with major announcements from Microsoft making WSL and VS Code's AI features open source. We've also got updates on Azure Container Apps, .NET Aspire, and some great insights on developer productivity tools.
Let's dive into these interesting reads that caught my attention this week.
Cloud
Happy 5th Birthday Bicep! (Sam Cogan) - What?! Five years already! That's incredible. I remember all the discussions about how we could make our business better, and honestly, Bicep is a big success. Congrats to the team!
Have I Been Pwned 2.0 is Now Live! (Troy Hunt) - New look, new merch, and confetti, all without API breaking changes! Learn all about this major update in this post.
What's new in .NET Aspire 9.3 (David Pine) - Wow! How can so many great new features be added in a single version?! Aspire is a must for all .NET developers.
Accelerate Your .NET Upgrades with GitHub Copilot (McKenna Barlow) - That's the tool I've been waiting on for ages! Adding Copilot to the extension is the smartest move they could make. I'm going to update an app right away and will share more later.
Open Source
Edit is now open source (Christopher Nguyen) - That's great news! I installed it halfway through the post and it's great! Fast, simple, and tiny! Love it!
Agent mode for every developer (Katie Savage) - Great news for everyone, as agent mode becomes available in so many different editors. This post also contains videos showing some scenarios.
Podcasts
Reimagining the Windows Terminal with Warp's Zach Lloyd (Scott Hanselman) - A very interesting talk with the CEO of Warp that answers so many questions I had about this very different terminal. Really interesting episode (and terminal too, BTW).
The experience is enough (Salma Alam-Naylor) - Whether we like it or not, we are social creatures. We all need to stop hiding behind our screens and get out there!
Automating deployments is something I always enjoy. However, it's true that it often takes more time than a simple "right-click deploy." Plus, you may need to know different technologies and scripting languages.
But what if there was a tool that could help you write everything you need—Infrastructure as Code (IaC) files, scripts to copy files, and scripts to populate a database? In this post, we'll explore how the Azure Developer CLI (azd) can make deployments much easier.
What do we want to do?
Our goal: Deploy the 2D6 Dungeon App to Azure Container Apps.
This .NET Aspire solution includes:
A frontend
A data API
A database
The Problem
In a previous post, we showed how azd up can easily deploy web apps to Azure.
If we try the same command for this solution, the deployment will be successful—but incomplete:
The .NET Blazor frontend is deployed perfectly.
However, the app fails when trying to access data.
Looking at the logs, we see the database wasn't created or populated, and the API container fails to start.
Let's look more closely at these issues.
The Database
When running the solution locally, Aspire creates a MySQL container and executes SQL scripts to create and populate the tables. This is specified in the AppHost project:
var mysql = builder.AddMySql("sqlsvr2d6")
    .WithLifetime(ContainerLifetime.Persistent); // keep the container across AppHost runs

var db2d6 = mysql.AddDatabase("db2d6");

// Mount the local folder containing the SQL scripts MySQL executes at startup
mysql.WithInitBindMount(source: "../../database/scripts", isReadOnly: false);
When MySQL starts, it looks for SQL files in a specific folder and executes them. Locally, this works because the bind mount is mapped to a local folder with the files.
However, when deployed to Azure:
The mounts are created in Azure Storage Files
The files are missing!
The Data API
This project uses Data API Builder (dab). Based on a single config file, a full data API is built and hosted in a container.
Locally, Aspire creates a DAB container and reads the JSON config file to create the API. This is specified in the AppHost project:
// Create the Data API Builder (DAB) container from the config file
// and connect it to the MySQL database
var dab = builder.AddDataAPIBuilder("dab", ["../../database/dab-config.json"])
    .WithReference(db2d6)
    .WaitFor(db2d6);
But once again, when deployed to Azure, the file is missing. The DAB container starts but fails to find the config file.
The Solution
The solution is simple: the SQL scripts and DAB config file need to be uploaded into Azure Storage Files during deployment.
You can do this by adding a post-provision hook in the azure.yaml file to execute a script that uploads the files. See an example of a post-provision hook in this post.
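As a minimal sketch, the hook in azure.yaml could look like the following; the script name and path are hypothetical, and the script itself could use az storage file upload-batch to copy the SQL scripts and the DAB config into the file share:

hooks:
  postprovision:
    shell: sh
    run: ./infra/scripts/upload-files.sh   # hypothetical script that uploads the files
    interactive: true
    continueOnError: false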
Alternatively, you can leverage azd alpha features: azd.operations and infraSynth.
azd.operations extends the provisioning providers and will upload the files for us.
infraSynth generates the IaC files for the entire solution.
💡Note: These features are in alpha and subject to change.
Each azd alpha feature can be turned on individually. To see all features:
azd config list-alpha
To activate the features we need:
azd config set alpha.azd.operations on
azd config set alpha.infraSynth on
Let's Try It
Once the azd.operations feature is activated, any azd up will now upload the files into Azure. If you check the database, you'll see that the db2d6 database was created and populated. Yay!
However, the DAB API will still fail to start. Why? Because, currently, DAB looks for a file, not a folder, when it starts. This can be fixed by modifying the IaC files.
One Last Step: Synthesize the IaC Files
First, let's synthesize the IaC files. These Bicep files describe the required infrastructure for our solution.
With the infraSynth feature activated, run:
azd infra synth
You'll now see a new infra folder under the AppHost project, with YAML files matching the container names. Each file contains the details for creating a container.
Open the dab.tmpl.yaml file to see the DAB API configuration. Look for the volumeMounts section. To help DAB find its config file, add subPath: dab-config.json to make the binding more specific:
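Here is a sketch of that section after the change; the volume name and mount path are illustrative, so keep the values azd generated for you:

    volumeMounts:
      - volumeName: dab-bm0              # keep the name generated by azd
        mountPath: /App/dab-config.json  # illustrative; match your existing mountPath
        subPath: dab-config.json         # added: mount only the config file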
You can also specify the scaling minimum and maximum number of replicas if you wish.
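For example, pinning the API to a single replica could look like this (the values shown are only an illustration):

    scale:
      minReplicas: 1
      maxReplicas: 1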
Now that the IaC files are created, azd will use them. If you run azd up again, the execution time will be much faster—azd deployment is incremental and only does "what changed."
The Final Result
The solution is now fully deployed:
The database is there with the data
The API works as expected
You can use your application!
Bonus: Deploying with CI/CD
Want to deploy with CI/CD? First, generate the GitHub Action (or Azure DevOps) workflow with:
azd pipeline config
Then, add a step to activate the alpha feature before the provisioning step in the azure-dev.yml file generated by the previous command.
- name: Extends provisioning providers with azd operations
  run: azd config set alpha.azd.operations on
With these changes, and assuming the infra files are included in the repo, the deployment will work on the first try.
Conclusion
It's exciting to see how tools like azd are shaping the future of development and deployment. Not only do they make the developer's life easier today by automating complex tasks, but they also ensure you're ready for production with all the necessary Infrastructure as Code (IaC) files in place. The journey from code to cloud has never been smoother!
If you have any questions or feedback, I'm always happy to help—just reach out on your favorite social media platform.