This week I'm looking at some interesting .NET stuff like Typemock's architecture and how Copilot Studio uses WebAssembly to boost performance. There's also a good reminder about why setting up CI/CD early (when your app is tiny) saves you tons of headaches later. Plus, I found a couple of great podcast episodes on building modern SaaS products and what actually makes a personal brand different from just having a reputation.
Why you should set up CI/CD from day one for your apps - LogRocket Blog (Lewis Cianci) - This post gives many reasons to start with CI/CD early, and on top of those, I would add that when your app is super small, it's much easier to create that CI/CD pipeline than at the end, when everything is far more complex.
In a recent post, I shared how to set up a CI/CD pipeline for a .NET Aspire project on GitLab. The pipeline includes unit tests, security scanning, and secret detection, and if any of those fail, the pipeline would fail. Great, but what about code coverage for the unit tests? The pipeline included code coverage commands, but the coverage was not visible in the GitLab interface. Let's fix that.
One thing I initially thought was that the regex used to extract the coverage was incorrect. The regex used in the pipeline was:
coverage: '/Total\s*\|\s*(\d+(?:\.\d+)?)%/'
That regex came directly from the GitLab documentation, so I thought it should work correctly. However, coverage still wasn't visible in the GitLab interface.
So with the help of GitHub Copilot, we wrote a few commands to validate:
That the coverage.cobertura.xml was in a consistent location (instead of being in a folder with a GUID name)
That the coverage.cobertura.xml file was in a valid format
What exactly the regex was looking for
Everything checked out fine, so why was the coverage not visible?
The Solution
It turns out that the coverage keyword with its regex scans the console output, not the coverage.cobertura.xml file. Aha! One solution was to install a dotnet tool to change where the test results were written, sending them to the console instead of the XML file, but I preferred keeping the .NET environment unchanged.
The solution I ended up implementing was executing a grep command to extract the coverage from the coverage.cobertura.xml file and then echoing it to the console. Here's what it looks like:
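The sketch below shows the idea; the job name, file locations, and exact commands are illustrative (and assume GNU grep and awk are available in the job image), not the literal lines from the template:

# .gitlab-ci.yml (sketch)
unit-tests:
  stage: test
  script:
    - dotnet test --collect:"XPlat Code Coverage"
    # locate the Cobertura report (its folder name contains a GUID)
    - REPORT=$(find . -name 'coverage.cobertura.xml' | head -n 1)
    # read the line-rate attribute and convert it to a percentage
    - RATE=$(grep -oP 'line-rate="\K[0-9.]+' "$REPORT" | head -n 1)
    - COVERAGE=$(awk -v r="$RATE" 'BEGIN { printf "%.2f", r * 100 }')
    # echo it in a format that GitLab's coverage regex can match
    - echo "Total | ${COVERAGE}%"
  coverage: '/Total\s*\|\s*(\d+(?:\.\d+)?)%/'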
I hope this helps others save time when setting up code coverage for their .NET projects on GitLab. The key insight is that GitLab's coverage regex works on console output, not on the files (XML or other formats).
If you have any questions or suggestions, feel free to reach out!
Getting a complete CI/CD pipeline for your .NET Aspire solution doesn't have to be complicated. I've created a template that gives you everything you need to get started in minutes.
Replace the sample project with your own .NET Aspire code
Push to your GitLab repository
Watch your CI/CD pipeline run automatically
That's it! You immediately get automated builds, testing, and security scanning.
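If it helps, the whole flow looks roughly like this (the repository URLs are placeholders for your copy of the template and your own GitLab project):

# quick start (sketch); replace the placeholder URLs with your own
git clone <template-repo-url> my-aspire-app
cd my-aspire-app
# swap in your own .NET Aspire projects, then point the remote at your GitLab project
git remote set-url origin <your-gitlab-project-url>
git push -u origin main    # the pipeline starts automatically on push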
Pro Tip: The best time to set up CI/CD is when you're just starting your project because everything is still simple.
Part 2: Building the Template with GitLab Duo
Now let me share my experience creating this template using GitLab's AI assistant, GitLab Duo.
Starting Simple, Growing Smart
I didn't build this complex pipeline all at once. I started with something very basic and used GitLab Duo to gradually add features. The AI helped me:
Add secret detection when I asked: "How can I scan for accidentally committed secrets?" (see the include snippet after this list)
Fix test execution issues when my unit tests weren't running properly
Optimize the pipeline structure for better performance
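For the secret detection piece, GitLab provides a ready-made CI template; including it looks roughly like this (the template name is the one documented by GitLab, but verify it against the docs for your GitLab version):

# .gitlab-ci.yml (excerpt)
include:
  - template: Security/Secret-Detection.gitlab-ci.yml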
Working with GitLab in VS Code
While you can edit .gitlab-ci.yml files directly in GitLab's web interface, I prefer VS Code. Here's my setup:
Install the official GitLab extension from the VS Code marketplace
Once you've signed in, this extension gives you:
Direct access to GitLab issues and work items
AI-powered chat with GitLab Duo
GitLab Duo in Action
GitLab Duo became my pair programming partner. Here's how I used it:
Understanding Code: By highlighting a section of my pipeline configuration and typing /explain, I could ask Duo to explain what that part does.
Solving Problems: When my solution didn't compile, I described the issue to Duo and got specific suggestions. For example, it helped me realize that dotnet build required the Aspire workload because some projects weren't on .NET 9. I could either keep the project on .NET 8 and add a before_script instruction to install the workload, or upgrade to .NET 9; I picked the latter.
Adding Features: I started with just build and test, then incrementally asked Duo to help me add security scanning, secret detection, and better error handling.
Adding Context: Using /include to add the project file or the .gitlab-ci.yml file while asking questions helped Duo understand the context better.
Learn More with the Docs: During my journey, I knew Duo wasn't just making things up because it was referencing the documentation. I could continue my learning there and read more examples of how before_script is used in different contexts.
The AI-Assisted Development Experience
What impressed me most was how GitLab Duo helped me learn while building. Instead of just copying configurations from documentation, each conversation taught me something new about GitLab CI/CD best practices.
Conclusion
I think this template can be useful for anyone starting a .NET Aspire project. Ready to try it? Clone the template at cloud5mins/aspire-template and start building with confidence.
Whether you're new to .NET Aspire or CI/CD, this template gives you a good foundation. And if you want to customize it further, GitLab Duo is there to help you understand and modify the configuration.
If you think we should add more features or improve the template, feel free to open an issue in the repository. Your feedback is always welcome!
This week, we're exploring a wide range of topics, from .NET 10 previews and A/B testing to the latest in Azure development and AI. Plus, a selection of insightful podcast episodes to keep you informed and inspired.
Docker Model Runner (DevOps and Docker Talk: Cloud Native Interviews and Tooling) - I tried the new model feature of Docker and had many questions. All of them were answered during this episode.
Michael Washington: The Nature Of Data - Episode 353 (Azure & DevOps Podcast) - Interesting discussion about data, and a bit more about a really cool project, Michael's Data warehouse, because sometimes we need something that runs locally.
Testing has always been one of those tasks that developers know is essential but often find tedious. When I decided to add comprehensive unit tests to my NoteBookmark project, I thought: why not make this an experiment in AI-assisted development? What followed was a fascinating 4-hour journey that resulted in 88 unit tests, a complete CI/CD pipeline, and some valuable insights about working with AI coding assistants.
NoteBookmark is a .NET application built with C# that helps users manage and organize their reading notes and bookmarks. The project includes an API, a Blazor frontend, and uses Azure services for storage. You can check out the complete project on GitHub.
The Challenge: Starting from Zero
I'll be honest - it had been a while since I'd written comprehensive unit tests. Rather than diving in myself, I decided to see how different AI models would approach this task. My initial request was deliberately vague: "add a test project" without any other specifications.
Looking back, I realize I should have been more specific about which parts of the code I wanted covered. This would have made the review process easier and given me better control over the scope. But sometimes, the best learning comes from letting the AI surprise you.
The Great AI Model Comparison
GPT-4.1: Competent but Quiet
GPT-4.1 delivered decent results, but the experience felt somewhat mechanical. The code it generated was functional, but I found myself wanting more context. The explanations were minimal, and I often had to ask follow-up questions to understand the reasoning behind certain test approaches.
Gemini: The False Start
My experience with Gemini was... strange. Perhaps it was a glitch or an off day, but most of what was generated simply didn't work. I didn't persist with this model for long, as debugging AI-generated code that fundamentally doesn't function defeats the purpose of the exercise. Note that at the time of this writing, Gemini was still in preview, so I expect it to improve over time.
Claude Sonnet: The Clear Winner
This is where the magic happened. Claude Sonnet became my co-pilot of choice for this project. What set it apart wasn't just the quality of the code (though that was excellent), but the quality of the conversation. It felt like having a thoughtful colleague thinking out loud with me.
The explanations were clear and educational. When Claude suggested a particular testing approach, it would explain why. When it encountered a complex scenario, it would walk through its reasoning. I tried different versions of Claude Sonnet but didn't notice significant differences in results - they were all consistently good.
The Development Process: A 4-Hour Journey
Hour 1-2: Getting to Compilation
The first iteration couldn't compile. This wasn't surprising given the complexity of the codebase and the vague initial request. But here's where the AI collaboration really shined. Instead of manually debugging everything myself, I worked with Copilot to identify and fix issues iteratively.
We went through several rounds of:
Identify compilation errors
Discuss the best approach to fix them
Let the AI implement the fixes
Review and refine
After about 2 hours, we had a test project with 88 unit tests that compiled successfully. The AI had chosen xUnit as the testing framework, which I was happy with - it's a solid choice that I might not have picked myself, given how rusty I was on the current .NET testing landscape.
Hour 2.5-3.5: Making Tests Pass
Getting the tests to compile was one thing; getting them to pass was another challenge entirely. This phase taught me a lot about both my codebase and xUnit features I wasn't familiar with.
I relied heavily on the /explain feature during this phase. When tests failed, I'd ask Claude to explain what was happening and why. This was invaluable for understanding not just the immediate fix, but the underlying testing concepts.
One of those moments was learning about [InlineData(true)] and other xUnit data attributes. These weren't features I was familiar with, and having them explained in context made them immediately useful.
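For anyone else who hasn't used them, here is what an xUnit data-driven test looks like; this is a generic illustration, not code from NoteBookmark, and the IsValidSlug helper is made up for the example:

using Xunit;

public class SlugTests
{
    // Hypothetical helper, used only to illustrate [Theory] and [InlineData]
    private static bool IsValidSlug(string? value) =>
        !string.IsNullOrWhiteSpace(value) && !value.Contains(' ');

    [Theory]
    [InlineData("my-first-note", true)]
    [InlineData("hello world", false)]  // spaces are rejected
    [InlineData("", false)]
    public void IsValidSlug_ReturnsExpectedResult(string? input, bool expected)
    {
        // each InlineData row runs as its own test case
        Assert.Equal(expected, IsValidSlug(input));
    }
}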
Hour 3.5-4: Structure and Style
Once all tests were passing, I spent time ensuring I understood each test and requesting structural changes to match my preferences. This phase was crucial for taking ownership of the code. Just because AI wrote it doesn't mean it should remain a black box. Let's repeat this: Understanding the code is essential; just because AI wrote it doesn't mean it's good.
Beyond Testing: CI/CD Integration
With the tests complete, I asked Copilot to create a GitHub Actions workflow to run tests on every push to the main and v-next branches, plus on PRs. Initially, it started modifying my existing workflow that takes care of the Azure deployment. I wanted a separate workflow for testing, so I interrupted it (nice that I wasn't "forced" to wait) and asked it to create a new one instead. The result was the running-unit-tests.yml workflow, which worked perfectly on the first try.
This was genuinely surprising. CI/CD configurations often require tweaking, but the generated workflow handled all of the following (a rough sketch of its shape follows the list):
Multi-version .NET setup
Dependency restoration
Building and testing
Test result reporting
Code coverage analysis
Artifact uploading
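The sketch below shows the general shape only; the actual running-unit-tests.yml in the repository differs in its details:

# running-unit-tests.yml (sketch, not the generated file)
name: Run Unit Tests
on:
  push:
    branches: [ main, v-next ]
  pull_request:
    branches: [ main, v-next ]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '9.0.x'
      - run: dotnet restore
      - run: dotnet build --no-restore
      - run: dotnet test --no-build --collect:"XPlat Code Coverage" --logger trx
      - uses: actions/upload-artifact@v4
        if: always()
        with:
          name: test-results
          path: '**/TestResults/**'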
The PR Enhancement Adventure
Here's where things got interesting. When I asked Copilot to enhance the workflow to show test results in PRs, it started adding components, then paused and asked if it could delete the current version and start from scratch.
I said yes, and I'm glad I did. The rebuilt version created beautiful PR comments showing:
Test results summary
Code coverage reports (which I didn't ask for but appreciated)
Detailed breakdowns.
The Finishing Touches
No project is complete without proper status indicators. I added a test status badge to the README, giving anyone visiting the repository immediate visibility into the project's health.
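For reference, a GitHub Actions status badge is just a Markdown image pointing at the workflow's badge endpoint; the owner and repository below are placeholders:

![Unit Tests](https://github.com/<owner>/<repo>/actions/workflows/running-unit-tests.yml/badge.svg)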
Key Takeaways
What Worked Well
AI as a Learning Partner: Having Copilot explain testing concepts and xUnit features was like having a patient teacher
Iterative Refinement: The back-and-forth process felt natural and productive
Comprehensive Solutions: The AI didn't just write tests; it created a complete testing infrastructure
Quality Over Speed: While it took 4 hours, the result was thorough and well-structured
What I'd Do Differently
Be More Specific Initially: Starting with clearer scope would have streamlined the process
Set Testing Priorities: Identifying critical paths first would have been valuable
Plan for Visual Test Reports: Thinking about test result visualization from the start
Lessons About AI Collaboration
Model Choice Matters: The difference between AI models was significant
Conversation Quality Matters: Clear explanations make the collaboration more valuable
Trust but Verify: Understanding every piece of generated code is crucial
Embrace Iteration: The best results come from multiple refinement cycles
The Bigger Picture
This experiment reinforced my belief that AI coding assistants are most powerful when they're true collaborators rather than code generators. The value wasn't just in the 88 tests that were written, but in the learning that happened along the way.
For developers hesitant about AI assistance in testing: this isn't about replacing your testing skills, it's about augmenting them. The AI handles the boilerplate and suggests patterns, but you bring the domain knowledge and quality judgment.
Conclusion
Would I do this again? Absolutely. The combination of comprehensive test coverage, learning opportunities, and time efficiency made this a clear win. The 4 hours invested created not just tests, but a complete testing infrastructure that will pay dividends throughout the project's lifecycle.
If you're considering AI-assisted testing for your own projects, my advice is simple: start the conversation, be prepared to iterate, and don't be afraid to ask "why" at every step. The goal isn't just working code - it's understanding and owning that code.
The complete test suite and CI/CD pipeline are available in the NoteBookmark repository if you want to see the results of this AI collaboration in action.
Automating deployments is something I always enjoy. However, it's true that it often takes more time than a simple "right-click deploy." Plus, you may need to know different technologies and scripting languages.
But what if there was a tool that could help you write everything you need—Infrastructure as Code (IaC) files, scripts to copy files, and scripts to populate a database? In this post, we'll explore how the Azure Developer CLI (azd) can make deployments much easier.
What do we want to do?
Our goal: Deploy the 2D6 Dungeon App to Azure Container Apps.
This .NET Aspire solution includes:
A frontend
A data API
A database
The Problem
In a previous post, we showed how azd up can easily deploy web apps to Azure.
If we try the same command for this solution, the deployment will be successful—but incomplete:
The .NET Blazor frontend is deployed perfectly.
However, the app fails when trying to access data.
Looking at the logs, we see the database wasn't created or populated, and the API container fails to start.
Let's look more closely at these issues.
The Database
When running the solution locally, Aspire creates a MySQL container and executes SQL scripts to create and populate the tables. This is specified in the AppHost project:
var mysql = builder.AddMySql("sqlsvr2d6")
.WithLifetime(ContainerLifetime.Persistent);
var db2d6 = mysql.AddDatabase("db2d6");
mysql.WithInitBindMount(source: "../../database/scripts", isReadOnly: false);
When MySQL starts, it looks for SQL files in a specific folder (/docker-entrypoint-initdb.d in the official image) and executes them. Locally, this works because the bind mount maps that folder to a local folder containing the files.
However, when deployed to Azure:
The mounts are created in Azure Storage Files
The files are missing!
The Data API
This project uses Data API Builder (dab). Based on a single config file, a full data API is built and hosted in a container.
Locally, Aspire creates a DAB container and reads the JSON config file to create the API. This is specified in the AppHost project:
var dab = builder.AddDataAPIBuilder("dab", ["../../database/dab-config.json"])
.WithReference(db2d6)
.WaitFor(db2d6);
But once again, when deployed to Azure, the file is missing. The DAB container starts but fails to find the config file.
The Solution
The solution is simple: the SQL scripts and DAB config file need to be uploaded into Azure Storage Files during deployment.
You can do this by adding a post-provision hook in the azure.yaml file to execute a script that uploads the files. See an example of a post-provision hook in this post.
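As a rough sketch, such a hook in azure.yaml could look like the excerpt below; the script name is a placeholder for whatever uploads your SQL scripts and DAB config file to the storage account:

# azure.yaml (excerpt, sketch)
hooks:
  postprovision:
    shell: sh
    run: ./infra/scripts/upload-config-files.sh   # placeholder script name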
Alternatively, you can leverage azd alpha features: azd.operations and infraSynth.
azd.operations extends the provisioning providers and will upload the files for us.
infraSynth generates the IaC files for the entire solution.
💡Note: These features are in alpha and subject to change.
Each azd alpha feature can be turned on individually. To see all features:
azd config list-alpha
To activate the features we need:
azd config set alpha.azd.operations on
azd config set alpha.infraSynth on
Let's Try It
Once the azd.operations feature is activated, any azd up will now upload the files into Azure. If you check the database, you'll see that the db2d6 database was created and populated. Yay!
However, the DAB API will still fail to start. Why? Because, currently, DAB looks for a file, not a folder, when it starts. This can be fixed by modifying the IaC files.
One Last Step: Synthesize the IaC Files
First, let's synthesize the IaC files. These Bicep files describe the required infrastructure for our solution.
With the infraSynth feature activated, run:
azd infra synth
You'll now see a new infra folder under the AppHost project, with YAML files matching the container names. Each file contains the details for creating a container.
Open the dab.tmpl.yaml file to see the DAB API configuration. Look for the volumeMounts section. To help DAB find its config file, add subPath: dab-config.json to make the binding more specific:
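The excerpt below illustrates the idea; the volume name and mount path are placeholders that will differ in your generated file, and the subPath line is the part to add:

# dab.tmpl.yaml (excerpt, illustrative)
volumeMounts:
  - volumeName: dab-config-volume     # placeholder; keep the generated name
    mountPath: /App/dab-config.json   # placeholder; keep the generated path
    subPath: dab-config.json          # added so only this file is mounted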
You can also specify the scaling minimum and maximum number of replicas if you wish.
Now that the IaC files are created, azd will use them. If you run azd up again, the execution time will be much faster—azd deployment is incremental and only does "what changed."
The Final Result
The solution is now fully deployed:
The database is there with the data
The API works as expected
You can use your application!
Bonus: Deploying with CI/CD
Want to deploy with CI/CD? First, generate the GitHub Action (or Azure DevOps) workflow with:
azd pipeline config
Then, add a step to activate the alpha feature before the provisioning step in the azure-dev.yml file generated by the previous command.
- name: Extends provisioning providers with azd operations
run: azd config set alpha.azd.operations on
With these changes, and assuming the infra files are included in the repo, the deployment will work on the first try.
Conclusion
It's exciting to see how tools like azd are shaping the future of development and deployment. Not only do they make the developer's life easier today by automating complex tasks, but they also ensure you're ready for production with all the necessary Infrastructure as Code (IaC) files in place. The journey from code to cloud has never been smoother!
If you have any questions or feedback, I'm always happy to help—just reach out on your favorite social media platform.
Welcome to this week's reading notes! Dive into the latest on Microsoft's new AI chat template, explore Docker's MCP CLI, learn about integrating AI into .NET applications, and discover how to automate .NET MAUI library publishing with GitHub Actions.
Whether you're interested in AI advancements, programming techniques, or DevOps practices, there's something valuable waiting for you below.
MsDevMtl Meetup - Uno
Suggestion of the week
Exploring the new AI chat template (Andrew Lock) - This new .NET template looks pretty useful and has a lot of components already baked in. This post explains those and shows how to customize it for our usage.
Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.
How to Use the Alpine Docker Official Image (Tyler Charboneau) - Alpine is a must-know for all dev and DevOps people (and probably many more). This post is perfect for getting you started.
Why the C64 Demoscene matters with Clay Token's Bilgem Çakır (Hanselminutes with Scott Hanselman) - Scott has been sharing content about the Commodore 64 these days, and that makes me want to jump back in, as it was my first computer... But this time I would understand English without having to search for each word in a dictionary.
Good Monday, it's time to share new reading notes. Here is a list of all the articles and blog posts that caught my interest during the week.
If you think you may have interesting content, share it!
The suggestion of the week
Deploy Azure Static Web Apps With Bicep | LINQ to Fail (Aaron Powell) - Great tutorial that explains how to build a well-structured deployment pipeline using Bicep and GitHub Actions. I will need this for sure; bookmarked.
ApiController Attribute in ASP.NET Core Web API (Code Maze) - This post contained many best practices and detailed explanations to get a great API and make sure the user experience is the best possible.
New Resources to Get Started with .NET MAUI - .NET Blog (Matt Soucoup) - Are you planning to learn something new this summer? I suggest .NET MAUI, to build applications that can go everywhere. This post shares tons of references to get you started.
Good Monday, already time to share new reading notes. Here is a list of all the articles, blog posts, and podcast episodes that caught my interest during the week.
If you think you may have interesting content, share it!
Build configuration for Azure Static Web Apps (craigshoemaker, anthonychu, Reshmi-Sriram, changeworld) - Continuous deployment is so powerful; it is very useful to understand the options and how to put everything together.
It's Monday, time to share my reading notes. These are a curated list of all the articles, blog posts, podcast episodes, and books that caught my interest during the week and that I found interesting. It's a mix of current news and what I've been consuming.
If you think you may have interesting content, share it!
Configure Azure Cosmos DB Continuous Backups (Rajendra Gupta) - Backups can be so powerful! You could go back in time to just before an error to understand what happened... or cover so many other scenarios.
How To Run PowerShell Scripts (Brien Posey) - Scripts can be frightening at first, but this nice post will help you understand them better. Perfect for less technical people.
Introducing Qodana for Azure Pipelines (Anastasia Khramushina) - Qodana can analyze your code in CI/CD on many platforms, and now also in Azure DevOps.
Monday means it's reading notes time. These are a curated list of all the articles, blog posts, podcast episodes, and books that caught my interest during the week and that I found interesting. It's a mix of current news and what I've been consuming.
If you think you may have interesting content, share it!
Azure Apps Autopilot (Justin Yoo) - A great DevOps post that builds an automatic deployment process. Very inspiring; I think I may use some of it for my AzUrlShortener.
Windows Package Manager 1.2 (Demitrius Nelon) - I just tried it, and it works so well. This is good news, and there are already so many packages available.
Entrepreneurship as a developer - (Software Engineering Unlocked) - What does it mean to think at scale? I am not quite sure I would enjoy doing it his way, but it's definitely working.
Create a Windows 10 development virtual machine (Thomas Maurer) - Great tutorial to create a dev environment. So useful when we need to create a specific context or use a different version to investigate client issues.
Visual Studio 2022 (Amanda Silver) - I'm always impressed by how new features are continuously added to VS. Such a great tool.
Miscellaneous
Why you should never quit too early ... (Donn Felker) - An inspiring post to help us persevere when times are hard, but also to try to step back and keep an open mindset.
Épisode 84 - Le bonheur de s’entraîner (Grand écart) - I'm not a runner, but I like (or used to like) moving and being outside. This francophone podcast is really motivating and interesting.
Eating Frogs with Brian Tracy (The Productivityist Podcast) - I just found this podcast. I liked that book when I read it a few years ago. It was nice listening to this episode talking about it. Brought back great ideas.
Épisode 5 - La Chasse aux Sorcières (Les Pires Moments de l'Histoire) - Okay, THIS IS A MUST. Seriously, if you understand French, it's both educational and funny. Great job!
2020 sucked - A year-end wrap-up with Scott's Wife, Mo (Hanselminutes with Scott Hanselman) - I'm so glad Mo agreed to come on the show once more. Their dynamic is very interesting. I need to make my wife listen to this episode, and my daughter too (she's studying to be a nurse).
628 - How to Be Confident, Not Arrogant (Modern Mentor) - The line between the two is easy to cross. I appreciated the time for reflection that this episode offers.
The Infinite Game with Dr. James Carse (A Bit of Optimism) - The Infinite Game, a great book that I read last year. It was awesome to listen to Simon talk about those ideas with Dr. Carse.
15+ Chrome extensions you should have in your pocket. (Jane Tracy) - A nice list of extensions. I have a few of them already. It's always hard to find the balance between all the ones that are great and what you really need.
YouTube Channel Art Ultimate Guide (Your Thriving Side Hustle) - A really good show. In fact, I was just thinking about that topic, so the timing was excellent! Very interesting.
Every week, I publish my reading notes. These are the articles, blog posts, podcast episodes, and books that caught my interest and that I found interesting. It's a mix of current news and what I was looking for.
Cloud
Azure Mystery Mansion - Microsoft in Business Blogs (Em Lazer-Walker) - Very interesting post. As I'm writing myself a text-based adventure game (just for fun), it's interesting to see the different approaches and tools available.