
Reading Notes #653

Welcome to Reading Notes #653—another packed edition of insights, tools, and updates from the tech world! This week's roundup dives into legendary engineering wisdom, AI controversies, and the latest innovations in Docker, Azure, and VS Code. Whether you're exploring MCP, refining your scripting skills, or gearing up for the newest Azure Developer CLI release, there's something here for every developer.

windmill on the cap of Ile Perrot

Let’s get into it!

Cloud

  • Azure Developer CLI (azd) - June 2025 (Kristen Womack) - I love that tool! Great updates; so many new features and improvements in this version. I'm really looking forward to trying them all.

AI

Programming

Miscellaneous


~frank

Reading Notes #652

This week, we explore a variety of topics, from database containerization and AI security risks to the evolving landscape of gaming devices and cloud technologies. We also look at the shift toward security-first development and the integration of .NET Aspire with SQL Server for integration testing.


Let's dive in!

Suggestion of the week

Cloud

Programming

Databases

Miscellaneous

~frank

I Co-Wrote 88 Unit Tests Using AI: A Developer's Journey

Testing has always been one of those tasks that developers know is essential but often find tedious. When I decided to add comprehensive unit tests to my NoteBookmark project, I thought: why not make this an experiment in AI-assisted development? What followed was a fascinating 4-hour journey that resulted in 88 unit tests, a complete CI/CD pipeline, and some valuable insights about working with AI coding assistants.

(French version here)

The Project: NoteBookmark

NoteBookmark is a .NET application built with C# that helps users manage and organize their reading notes and bookmarks. The project includes an API, a Blazor frontend, and uses Azure services for storage. You can check out the complete project on GitHub.

The Challenge: Starting from Zero

I'll be honest - it had been a while since I'd written comprehensive unit tests. Rather than diving in myself, I decided to see how different AI models would approach this task. My initial request was deliberately vague: "add a test project" without any other specifications.

Looking back, I realize I should have been more specific about which parts of the code I wanted covered. This would have made the review process easier and given me better control over the scope. But sometimes, the best learning comes from letting the AI surprise you.

The Great AI Model Comparison



GPT-4.1: Competent but Quiet

GPT-4.1 delivered decent results, but the experience felt somewhat mechanical. The code it generated was functional, but I found myself wanting more context. The explanations were minimal, and I often had to ask follow-up questions to understand the reasoning behind certain test approaches.

Gemini: The False Start

My experience with Gemini was... strange. Perhaps it was a glitch or an off day, but most of what was generated simply didn't work. I didn't persist with this model for long, as debugging AI-generated code that fundamentally doesn't function defeats the purpose of the exercise. Note that at the time of this writing, Gemini was still in preview, so I expect it to improve over time.

Claude Sonnet: The Clear Winner

This is where the magic happened. Claude Sonnet became my co-pilot of choice for this project. What set it apart wasn't just the quality of the code (though that was excellent), but the quality of the conversation. It felt like having a thoughtful colleague thinking out loud with me.

The explanations were clear and educational. When Claude suggested a particular testing approach, it would explain why. When it encountered a complex scenario, it would walk through its reasoning. I tried different versions of Claude Sonnet but didn't notice significant differences in results - they were all consistently good.

The Development Process: A 4-Hour Journey


Hour 1-2: Getting to Compilation

The first iteration couldn't compile. This wasn't surprising given the complexity of the codebase and the vague initial request. But here's where the AI collaboration really shined. Instead of manually debugging everything myself, I worked with Copilot to identify and fix issues iteratively.

We went through several rounds of:

  1. Identify compilation errors
  2. Discuss the best approach to fix them
  3. Let the AI implement the fixes
  4. Review and refine

After about 2 hours, we had a test project with 88 unit tests that compiled successfully. The AI had chosen xUnit as the testing framework, which I was happy with; it's a solid choice that I might not have picked myself, given how rusty I was on the current .NET testing landscape.
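Copilot scaffolded everything for me, but for reference, here is roughly what that setup looks like from the command line. This is a sketch; the project and folder names are hypothetical:

dotnet new xunit -n NoteBookmark.Tests -o tests/NoteBookmark.Tests
dotnet sln add tests/NoteBookmark.Tests
dotnet add tests/NoteBookmark.Tests reference src/NoteBookmark.Api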

Hour 2.5-3.5: Making Tests Pass

Getting the tests to compile was one thing; getting them to pass was another challenge entirely. This phase taught me a lot about both my codebase and xUnit features I wasn't familiar with.

I relied heavily on the /explain feature during this phase. When tests failed, I'd ask Claude to explain what was happening and why. This was invaluable for understanding not just the immediate fix, but the underlying testing concepts.

One of those moments was learning about [InlineData(true)] and other xUnit data attributes. These weren't features I was familiar with, and having them explained in context made them immediately useful.
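To give an idea, here is a minimal sketch of a data-driven xUnit test. It's a hypothetical example, not one from NoteBookmark; the point is that a [Theory] method runs once per [InlineData] entry, with the values passed in as parameters:

using System;
using Xunit;

public class UrlValidationTests
{
    // One test method, two executions: xUnit runs it once per [InlineData]
    [Theory]
    [InlineData("https://example.com", true)]
    [InlineData("not a url", false)]
    public void IsWellFormedUrl_ReturnsExpected(string input, bool expected)
    {
        var actual = Uri.IsWellFormedUriString(input, UriKind.Absolute);
        Assert.Equal(expected, actual);
    }
}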


InlineData in the code


Hour 3.5-4: Structure and Style

Once all tests were passing, I spent time ensuring I understood each test and requesting structural changes to match my preferences. This phase was crucial for taking ownership of the code. Just because AI wrote it doesn't mean it should remain a black box. Let's repeat this: Understanding the code is essential; just because AI wrote it doesn't mean it's good.

Beyond Testing: CI/CD Integration

With the tests complete, I asked Copilot to create a GitHub Actions workflow to run tests on every push to the main and v-next branches, plus on PR reviews. Initially, it started modifying my existing workflow that takes care of the Azure deployment. I wanted a separate workflow for testing, so I interrupted it (it's nice that I wasn't "forced" to wait) and asked it to create a new one instead. The result was the running-unit-tests.yml workflow, which worked perfectly on the first try.

This was genuinely surprising. CI/CD configurations often require tweaking, but the generated workflow (a simplified sketch follows the list) handled:

  • Multi-version .NET setup
  • Dependency restoration
  • Building and testing
  • Test result reporting
  • Code coverage analysis
  • Artifact uploading
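Here is a stripped-down sketch of what such a workflow can look like. It is not the exact file from the repo; the .NET version and trigger details are assumptions:

name: Run Unit Tests

on:
  push:
    branches: [ main, v-next ]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      # Get the sources and the right SDK
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '8.0.x' # assumed version

      # Restore, build, then run the tests with coverage collection
      - run: dotnet restore
      - run: dotnet build --no-restore
      - run: dotnet test --no-build --collect:"XPlat Code Coverage"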

Code coverage


The PR Enhancement Adventure

Here's where things got interesting. When I asked Copilot to enhance the workflow to show test results in PRs, it started adding components, then paused and asked if it could delete the current version and start from scratch.

I said yes, and I'm glad I did. The rebuilt version created beautiful PR comments showing:

  • Test results summary
  • Code coverage reports (which I didn't ask for but appreciated)
  • Detailed breakdowns

PR display


The Finishing Touches

No project is complete without proper status indicators. I added a test status badge to the README, giving anyone visiting the repository immediate visibility into the project's health.
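If you want to do the same, such a badge is a single line of Markdown in the README pointing at the workflow file; replace <owner> with the account hosting the repository:

![Unit Tests](https://github.com/<owner>/NoteBookmark/actions/workflows/running-unit-tests.yml/badge.svg)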

test status badge


Key Takeaways


What Worked Well

  1. AI as a Learning Partner: Having Copilot explain testing concepts and xUnit features was like having a patient teacher
  2. Iterative Refinement: The back-and-forth process felt natural and productive
  3. Comprehensive Solutions: The AI didn't just write tests; it created a complete testing infrastructure
  4. Quality Over Speed: While it took 4 hours, the result was thorough and well-structured

What I'd Do Differently

  1. Be More Specific Initially: Starting with clearer scope would have streamlined the process
  2. Set Testing Priorities: Identifying critical paths first would have been valuable
  3. Plan for Visual Test Reports: Thinking about test result visualization from the start

Lessons About AI Collaboration

  • Model Choice Matters: The difference between AI models was significant
  • Conversation Quality Matters: Clear explanations make the collaboration more valuable
  • Trust but Verify: Understanding every piece of generated code is crucial
  • Embrace Iteration: The best results come from multiple refinement cycles

The Bigger Picture

This experiment reinforced my belief that AI coding assistants are most powerful when they're true collaborators rather than code generators. The value wasn't just in the 88 tests that were written, but in the learning that happened along the way.

For developers hesitant about AI assistance in testing: this isn't about replacing your testing skills, it's about augmenting them. The AI handles the boilerplate and suggests patterns, but you bring the domain knowledge and quality judgment.

Conclusion

Would I do this again? Absolutely. The combination of comprehensive test coverage, learning opportunities, and time efficiency made this a clear win. The 4 hours invested created not just tests, but a complete testing infrastructure that will pay dividends throughout the project's lifecycle.

If you're considering AI-assisted testing for your own projects, my advice is simple: start the conversation, be prepared to iterate, and don't be afraid to ask "why" at every step. The goal isn't just working code - it's understanding and owning that code.

The complete test suite and CI/CD pipeline are available in the NoteBookmark repository if you want to see the results of this AI collaboration in action.


Reading Notes #651

Welcome to another edition of my reading notes! This week brings some fascinating insights into AI's real-world impact, exciting developments in .NET and containerization, plus practical tools for improving our development workflows. 
A duck in a city fountain

From local AI-powered code reviews to Docker security hardening and the upcoming .NET 10 features, there's plenty to explore.

 

AI

Programming

Cloud

Miscellaneous

  • Enhance productivity with AI + Remote Dev (Brigit Murtaugh, Christof Marti, Josh Spicer, Olivia Guzzardo McVicker) - I love dev container environments; they are so useful! I also use remote ones when I'm not on my dev device, it's so easy. Happy to see that Copilot will be right there with me.
~frank

Reading Notes #650

It's time for another edition of Reading Notes! This week brings exciting developments in the open source world, with major announcements from Microsoft making WSL and VS Code's AI features open source. We've also got updates on Azure Container Apps, .NET Aspire, and some great insights on developer productivity tools.
 
Let's dive into these interesting reads that caught my attention this week.

Cloud

Programming

Open Source

AI

  • Agent mode for every developer (Katie Savage) - Great news for everyone, as agent mode becomes available in so many different editors. This post also contains videos showing some scenarios.

Podcasts

Miscellaneous

  • The experience is enough (Salma Alam-Naylor) - Whether we like it or not, we are social creatures. We all need to stop hiding behind our screens and get out there!

~frank

Full-Stack Azure Deployment Made Easy: Containers, Databases, and More

Automating deployments is something I always enjoy. However, it's true that it often takes more time than a simple "right-click deploy." Plus, you may need to know different technologies and scripting languages.

(French version here)

But what if there was a tool that could help you write everything you need—Infrastructure as Code (IaC) files, scripts to copy files, and scripts to populate a database? In this post, we'll explore how the Azure Developer CLI (azd) can make deployments much easier.

What do we want to do?

Our goal: Deploy the 2D6 Dungeon App to Azure Container Apps.

This .NET Aspire solution includes:

  • A frontend
  • A data API
  • A database

Aspire resources schema


The Problem

In a previous post, we showed how azd up can easily deploy web apps to Azure.

If we try the same command for this solution, the deployment will be successful—but incomplete:

  • The .NET Blazor frontend is deployed perfectly.
  • However, the app fails when trying to access data.
  • Looking at the logs, we see the database wasn't created or populated, and the API container fails to start.

Let's look more closely at these issues.

The Database

When running the solution locally, Aspire creates a MySQL container and executes SQL scripts to create and populate the tables. This is specified in the AppHost project:

var mysql = builder.AddMySql("sqlsvr2d6")
                   .WithLifetime(ContainerLifetime.Persistent);
                
var db2d6 = mysql.AddDatabase("db2d6");

mysql.WithInitBindMount(source: "../../database/scripts", isReadOnly: false);

When MySQL starts, it looks for SQL files in a specific folder and executes them. Locally, this works because the bind mount is mapped to a local folder with the files.

However, when deployed to Azure:

  • The mounts are created in Azure Storage Files
  • The files are missing!

The Data API

This project uses Data API Builder (dab). Based on a single config file, a full data API is built and hosted in a container.

Locally, Aspire creates a DAB container and reads the JSON config file to create the API. This is specified in the AppHost project:

var dab = builder.AddDataAPIBuilder("dab", ["../../database/dab-config.json"])
                .WithReference(db2d6)
                .WaitFor(db2d6);

But once again, when deployed to Azure, the file is missing. The DAB container starts but fails to find the config file.

Logs of DAB failing to start


The Solution

The solution is simple: the SQL scripts and DAB config file need to be uploaded into Azure Storage Files during deployment.

You can do this by adding a post-provision hook in the azure.yaml file to execute a script that uploads the files. See an example of a post-provision hook in this post.
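As a rough sketch, and assuming a hypothetical upload script, such a hook can be declared at the root of azure.yaml like this:

hooks:
  postprovision:
    shell: sh
    # Hypothetical script that copies the SQL scripts and the DAB config
    # into the Azure Storage File shares created during provisioning.
    run: ./hooks/upload-database-files.sh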

Alternatively, you can leverage azd alpha features: azd.operations and infraSynth.

  • azd.operations extends the provisioning providers and will upload the files for us.
  • infraSynth generates the IaC files for the entire solution.

💡Note: These features are in alpha and subject to change.

Each azd alpha feature can be turned on individually. To see all features:

azd config list-alpha

To activate the features we need:

azd config set alpha.azd.operations on
azd config set alpha.infraSynth on

Let's Try It

Once the azd.operations feature is activated, any azd up will now upload the files into Azure. If you check the database, you'll see that the db2d6 database was created and populated. Yay!

However, the DAB API will still fail to start. Why? Because, currently, DAB looks for a file, not a folder, when it starts. This can be fixed by modifying the IaC files.

One Last Step: Synthesize the IaC Files

First, let's synthesize the IaC files. These Bicep files describe the required infrastructure for our solution.

With the infraSynth feature activated, run:

azd infra synth

You'll now see a new infra folder under the AppHost project, with YAML files matching the container names. Each file contains the details for creating a container.

Open the dab.tmpl.yaml file to see the DAB API configuration. Look for the volumeMounts section. To help DAB find its config file, add subPath: dab-config.json to make the binding more specific:

containers:
    - image: {{ .Image }}
      name: dab
      env:
        - name: AZURE_CLIENT_ID
          value: {{ .Env.MANAGED_IDENTITY_CLIENT_ID }}
        - name: ConnectionStrings__db2d6
          secretRef: connectionstrings--db2d6
      volumeMounts:
        - volumeName: dab-bm0
          mountPath: /App/dab-config.json
          subPath: dab-config.json
scale:
    minReplicas: 1
    maxReplicas: 1

You can also specify the scaling minimum and maximum number of replicas if you wish.

Now that the IaC files are created, azd will use them. If you run azd up again, the execution time will be much faster—azd deployment is incremental and only does "what changed."

The Final Result

The solution is now fully deployed:

  • The database is there with the data
  • The API works as expected
  • You can use your application!
2D6 Dungeon App deployed


Bonus: Deploying with CI/CD

Want to deploy with CI/CD? First, generate the GitHub Action (or Azure DevOps) workflow with:

azd pipeline config

Then, add a step to activate the alpha feature before the provisioning step in the azure-dev.yml file generated by the previous command.

- name: Extends provisioning providers with azd operations
  run: azd config set alpha.azd.operations on     

With these changes, and assuming the infra files are included in the repo, the deployment will work on the first try.

Conclusion

It's exciting to see how tools like azd are shaping the future of development and deployment. Not only do they make the developer's life easier today by automating complex tasks, but they also ensure you're ready for production with all the necessary Infrastructure as Code (IaC) files in place. The journey from code to cloud has never been smoother!

If you have any questions or feedback, I'm always happy to help—just reach out on your favorite social media platform.

In Video

Here is the video version of this blog post.




Reading Notes #649

Welcome to another edition of my reading notes! This week I've come across some interesting articles spanning from programming decisions to career tips and AI innovations. Hope you find something valuable in this collection.
minimal photography of a point of land sandwiched between gray sky and water

Programming


AI


Miscellaneous

  • Thriving After a Layoff: The First 30 Days (scrappy girl project) - A friend of mine shared this post with me, and I think it will help me in the coming days. So here I am, sharing it so more people can benefit from it.

Book cover: The Let Them Theory

Books

  • The Let Them Theory (Mel Robbins) - Stop wasting energy on things you can't control. It's something we all know but probably forget the power of. Letting go is hard, but it works in so many scenarios. In this book, Mel explains how to make it work, and how to pair it with a "let me" action.
     

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank


Reading Notes #648

Welcome to this week's reading notes! Dive into the latest on Microsoft's new AI chat template, explore Docker's MCP CLI, learn about integrating AI into .NET applications, and discover how to automate .NET MAUI library publishing with GitHub Actions. 

Whether you're interested in AI advancements, programming techniques, or DevOps practices, there's something valuable waiting for you below.

MsDevMtl Meetup - Uno


Suggestion of the week

  • Exploring the new AI chat template (Andrew Lock) - This new .NET template looks pretty useful and has a lot of components already baked in. This post explains those and shows how to customize it for our usage.

Programming


DevOps


AI


Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank


Reading Notes #647


This post is a collection of my latest reading notes, highlighting interesting articles and resources on AI, programming, databases, and more. Each link includes a brief summary of what I found valuable or noteworthy.
screenshot linux shutdown operations

AI


Programming


Databases


Miscellaneous

  • Help yourself to thrive (Salma) - The human body is an extraordinary machine, extremely strong and resilient, but it also requires fine tuning. Great post; we must learn from it.


Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank

Reading Notes #646

Welcome to this week's collection of fascinating reads across cloud computing, AI, and programming! As technology continues to evolve at breakneck speed, I've gathered some of the most insightful articles that caught my attention. From securing MCP servers to exploring Rust, there's something here for every tech enthusiast. 
Dive in and discover what's new in our rapidly changing digital landscape.

Cloud

AI

Programming

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank

Converting a Blazor WASM to FluentUI Blazor server

TL;DR: This post walks through migrating a Blazor WebAssembly project to FluentUI Blazor server, highlighting key improvements in UI, authentication, and containerization using Azure Container Apps and .NET Aspire.

(👓French version here)

Why Migrate?

The migration to FluentUI Blazor server brought three major benefits:

  • 🎨 Modern UI with FluentUI components
  • 🔒 Simplified authentication using Azure Container Apps
  • 🚀 Better development experience with .NET Aspire

In this post, I'm sharing my journey while migrating a Blazor WebAssembly (WASM) project to a FluentUI Blazor server project. The goal was to use the new FluentUI Blazor components library, take advantage of .NET Aspire and be able to execute the project in a container.

Recently, I've been working on the migration of AzUrlShortener: upgrading SDKs and packages, improving security, and changing the architecture of the solution. This post is part of a series of posts where I share a few interesting details, tips, and tricks I learned while working on this project.

AzUrlShortener is an open-source project consisting of a simple URL shortener that I built a few years ago. The goal was simple: having a budget-friendly solution to share short URLs that would be secure, easy to use, and where the data would stay mine. Each instance is hosted in Azure and consists of an API (Azure Function), a Blazor WebAssembly website (Azure Static Web App), and data storage (Azure Storage table).

This post is part of a series about modernizing the AzUrlShortener project:

Migration Strategy: Fresh Start vs. Refactor

When migrating an existing project, you have two options: editing the existing project to reshape it into the new type, or creating a new project and copy-pasting pieces of code from the old one. In this case, I chose to create a new project and copy-paste the code. This way, I could keep the old project as a backup in case something went wrong.

Creating a New Project

As mentioned earlier, I wanted to use the new FluentUI Blazor components library. It's an open-source project that provides a set of components for building web applications using the Fluent Design System. It makes it easy to create beautiful, responsive, and consistent user interfaces. To create a new project, we can use the available template.

dotnet new fluentblazor -n Cloud5mins.ShortenerTools.TinyBlazorAdmin

Dark Mode & Theming Support 🌙

The one thing I do in all my FluentUI Blazor projects is add a settings page. This page allows the user to change the theme and accent color of the application. I should make a template to save time, but until then, here is the required code to add the settings page.

Settings.razor

Let's start by creating that new page, called Settings.razor, with two selects: one for the theme (dark or light) and one for the accent color.

@page "/settings"

@using Microsoft.FluentUI.AspNetCore.Components.Extensions

@rendermode InteractiveServer

<FluentDesignTheme @bind-Mode="@Mode"
				   @bind-OfficeColor="@OfficeColor"
				   StorageName="theme" />

<h3>Settings</h3>

<div>
	<FluentStack Orientation="Orientation.Vertical">
		<FluentSelect   Label="Theme" Width="150px"
						Items="@(Enum.GetValues<DesignThemeModes>())"
						@bind-SelectedOption="@Mode" />
		<FluentSelect   Label="Color"
						Items="@(Enum.GetValues<OfficeColor>().Select(i => (OfficeColor?)i))"
			Height="200px" Width="250px" @bind-SelectedOption="@OfficeColor">
			<OptionTemplate>
				<FluentStack>
					<FluentIcon Value="@(new Icons.Filled.Size20.RectangleLandscape())" Color="Color.Custom"
						CustomColor="@(@context.ToAttributeValue() != "default" ? context.ToAttributeValue() : "#036ac4" )" />
					<FluentLabel>@context</FluentLabel>
				</FluentStack>
			</OptionTemplate>
		</FluentSelect>
	</FluentStack>
</div>

@code {
    public DesignThemeModes Mode { get; set; }
    public OfficeColor? OfficeColor { get; set; }
}

App.razor

In the App itself, we need to add some JavaScript and the loading-theme component. Just after the </body> tag, we need to add the following code:

<!-- Set the default theme -->

<script src="_content/Microsoft.FluentUI.AspNetCore.Components/js/loading-theme.js" type="text/javascript"></script>

<loading-theme storage-name="theme"></loading-theme>

Imports.razor

I noticed some warnings in the code about missing using directives. To fix that, find the line that references Components.Icons in the _Imports.razor and replace it with the following. The Icons alias should resolve the problem.

@using Icons = Microsoft.FluentUI.AspNetCore.Components.Icons

MainLayout.razor

The main layout is our base page by default. We need to add Mode and OfficeColor to make them accessible to the entire application.

@code {
    public DesignThemeModes Mode { get; set; }
    public OfficeColor? OfficeColor { get; set; }
}

NavMenu.razor

The only thing left is to be able to easily access this new page. This can be done simply by adding an option in the navigation menu.

<FluentNavLink Href="/settings" Match="NavLinkMatch.All" Icon="@(new Icons.Regular.Size20.TextBulletListSquareSettings())">Settings</FluentNavLink>

Test it

And voilà! You should now have a settings page that allows you to change the theme and color of the application. This is all great and it's not really related to the migration, but it's a nice addition to have. Dark mode for the win!

The migration

The migration itself had many little pieces but wasn't that complex. The project is part of a .NET Aspire solution, so I added the project to the solution with dotnet sln add ./Cloud5mins.ShortenerTools.TinyBlazorAdmin. Then I added references to Cloud5mins.ShortenerTools.Core (the class library with all the models and services) and to the ServiceDefaults project that was generated when we added Aspire to the solution.

The next logical step was to add our Blazor project to the orchestrator with those lines in the Program.cs of the AppHost project.

builder.AddProject<Projects.Cloud5mins_ShortenerTools_TinyBlazorAdmin>("admin")
	.WithExternalHttpEndpoints()
	.WithReference(manAPI);

This will make sure the TinyBlazorAdmin project starts with a reference to the API and has accessible endpoints. Without .WithExternalHttpEndpoints(), the project wouldn't be accessible once deployed to Azure.

To use .NET Aspire's capability to orchestrate the different projects, we need to replace our previous HttpClient creation in the Program.cs of TinyBlazorAdmin with the following code:

builder.Services.AddHttpClient<UrlManagerClient>(client => 
{
    client.BaseAddress = new Uri("https+http://api");
});

This will make sure the UrlManagerClient receives an HttpClient using the correct address and port when calling the API. Let's have a look at the UrlManagerClient class and one of the methods used to call the API.

public class UrlManagerClient(HttpClient httpClient)
{

	public async Task<IQueryable<ShortUrlEntity>?> GetUrls()
    {
		IQueryable<ShortUrlEntity> urlList = null;
		try{
			using var response = await httpClient.GetAsync("/api/UrlList");
			if(response.IsSuccessStatusCode){
				var urls = await response.Content.ReadFromJsonAsync<ListResponse>();
				urlList = urls!.UrlList.AsQueryable<ShortUrlEntity>();
			}
		}
		catch(Exception ex){
			Console.WriteLine(ex.Message);
		}
        
		return urlList;
    }
	// ...
}

As the code shows the httpClient is injected in the constructor and used to call the API. The GetUrls method is a simple GET request that returns a list of ShortUrlEntity. The API is the one created in a previous post: How to use Azure Storage Table with .NET Aspire and a Minimal API, and all the classes are part of the Cloud5mins.ShortenerTools.Core project.

The URL Grid

Part of the migration was also to replace the Syncfusion grid with the new FluentUI Blazor grid. Not that Syncfusion controls aren't great, quite the contrary, but because the AzUrlShortener project has moved to a different owner, I think it's better to use components that require no licenses.

For this initial iteration, the Syncfusion grid will be replaced by the FluentUI Blazor grid. In a future iteration, the Syncfusion Charts component will also be replaced. Thank you, Syncfusion, for the community license used in this project.

The code of UrlManager.razor changed quite a lot, as the two grids were a bit different in their naming and usage. The sorting required a bit more code because the column names are not the same as the property names. For example, the "Vanity" column is in fact the RowKey property of the ShortUrlEntity class. To be able to sort the column, we need to create a GridSort object that will be used in the TemplateColumn definition.

Definition of the column in the grid:

<TemplateColumn Title="Vanity" Width="150px" Sortable="true" SortBy="@sortByVanities">
    <FluentAnchor Href="@context!.ShortUrl" Target="_blank" Appearance="Appearance.Hypertext">@context!.RowKey</FluentAnchor>
</TemplateColumn>

Definition of the GridSort object:

GridSort<ShortUrlEntity> sortByVanities = GridSort<ShortUrlEntity>.ByAscending(p => p.RowKey);

One big improvement that could be done in the future would be to use the virtual grid. The virtual grid is a great way to improve the performance of the grid when dealing with large amounts of data, as it only loads the data that is visible on the screen. I show how to use the virtual grid in a previous post: How use a Blazor QuickGrid with GraphQL.
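For reference, here is a minimal sketch of how that could look with the FluentUI Blazor grid, assuming the grid is bound to the same urlList. ItemSize is the fixed row height, in pixels, used to compute which rows are visible:

<FluentDataGrid Items="@urlList" Virtualize="true" ItemSize="46">
    <TemplateColumn Title="Vanity" Sortable="true" SortBy="@sortByVanities">
        <FluentAnchor Href="@context!.ShortUrl" Target="_blank">@context!.RowKey</FluentAnchor>
    </TemplateColumn>
</FluentDataGrid>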

Removing the fake popup div

One of the FluentUI Blazor components is the dialog (FluentDialog). This component is used to display a popup window and will help keep the code more structured and clean. Instead of scattering <div> elements acting as fake popups in the code, we can use the DialogService and the dialog will be rendered as a popup.

var dialog = await DialogService.ShowDialogAsync<NewUrlDialog>(shortUrlRequest, new DialogParameters()
	{
		Title = "Create a new Short Url",
		PreventDismissOnOverlayClick = true,
		PreventScroll = true
	});




Replacing the Authentication

Instead of having to implement authentication in the Blazor project, we will use a feature of Azure Container Apps that requires no code changes! You don't need to change a single line of code to secure your application deployed on Azure Container Apps (ACA); it is automatically protected simply by enabling the authentication feature, called EasyAuth.

Once the solution is deployed to Azure, TinyBlazorAdmin will be installed in a container app named "admin". To secure it, navigate to the Azure portal and select the container app you want to secure, in this case the "admin" container app. From the left menu, select Authentication and click Add identity provider.

You can choose between multiple providers, but let's use Microsoft since it's deployed in Azure and you are already logged in. Once Microsoft is chosen, you will see many configuration options. Select the recommended client secret expiration (e.g., 180 days). You can keep all the other default settings. Click Add. After a few seconds, you should see a notification in the top right corner that the identity provider was added successfully.

Voila! Your app now has authentication. Next time you navigate to the app, you will be prompted to log in with your Microsoft account. Notice that your entire app is protected. No page is accessible without authentication.

Conclusion

The migration from Blazor WebAssembly to FluentUI Blazor Server has been a successful journey that brought several meaningful improvements to the project:

  • Enhanced user interface with modern FluentUI components
  • Cleaner, more maintainable code structure
  • Simplified authentication using Azure Container Apps' EasyAuth
  • Improved local development experience with .NET Aspire orchestration

The end result is a modern, containerized application that's both easier to maintain and more pleasant to use. The addition of dark mode support and theming capabilities are great improvements to the user experience.

Want to Learn more?

To learn more about Azure Container Apps, I strongly suggest this repository: Getting Started .NET on Azure Container Apps. It contains many step-by-step tutorials (with videos) to learn how to use Azure Container Apps with .NET.

Have questions about the migration process or want to share your own experiences with FluentUI Blazor? Feel free to reach out to me on @fboucheros.bsky.social or open an issue on the AzUrlShortener GitHub repository. I'd love to hear your thoughts!



Reading Notes #645

Welcome to this week's reading notes! I've found some great articles that caught my eye - from security tips for MCP servers to exciting updates in Rust and AI. Whether you're into cloud services, programming tools, or wondering about the future of coding with AI, there's something here for you.

Let's dive in!

 

Programming

  • The Aspire Compiler (David Fowler) - I really appreciate Aspire. It's one of the tools that completely changed my experience as a developer. Learning more about it is, without a doubt, interesting.

  • Verifying tricky git rebases with git range-diff (Andrew Lock) - Is it possible to really master Git? There is always something new to learn. Nice post going deep.

  • Azure SDK for Rust Goes Beta (Nikos Vaggalis) - Great news for Rust developers; this is an important milestone.

AI



Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank


Making AI smarter with an MCP server that manages short URLs

Have you ever wanted to give your AI assistants access to your own custom tools and data? That's exactly what Model Context Protocol (MCP) allows us to do, and I've been experimenting with it lately.

(French version here)

I read a lot recently about Model Context Protocol (MCP) and how it is changing the way AI interacts with external systems. I was curious to see how it works and how I could use it in my own projects. There are many tutorials available online, but one of my favorites was written by James Montemagno: Build a Model Context Protocol (MCP) server in C#. This post isn't a tutorial, but rather a summary of my experience and what I learned along the way while building a real MCP server that manages short URLs.

MCP doesn't change the AI itself; it's a protocol that helps your AI model interact with external resources: APIs, databases, etc. The protocol simplifies the way AI can access an external system, and it allows the AI to discover the available tools from those resources. Recently, I was working on a project that manages short URLs, and I thought it would be a great opportunity to build an MCP server for it. I wanted to see how easy it is to build, and then use it in VS Code with GitHub Copilot Chat.

Code: All the code of this post is available in the branch exp/mcp-server of the AzUrlShortener repo on GitHub.

Setting Up: Adding an MCP Server to a .NET Aspire Solution

The AzUrlShortener is a web solution that uses .NET Aspire, so the first thing I did was create a new project using the command:

dotnet new web -n Cloud5mins.ShortenerTools.MCPServer -o ./mcpserver

Required Dependencies

To transform this into an MCP server, I added these essential NuGet packages:

  • Microsoft.Extensions.Hosting
  • ModelContextProtocol.AspNetCore

Since this project is part of a .NET Aspire solution, I also added references to:

  • The ServiceDefaults project (for consistent service configuration)
  • The ShortenerTools.Core project (where the business logic lives)

Integrating with Aspire

Next, I needed to integrate the MCP server into the AppHost project, which defines all services in our solution. Here's how I added it to the existing services:

var manAPI = builder.AddProject<Projects.Cloud5mins_ShortenerTools_Api>("api")
						.WithReference(strTables)
						.WaitFor(strTables)
						.WithEnvironment("CustomDomain",customDomain)
						.WithEnvironment("DefaultRedirectUrl",defaultRedirectUrl);

builder.AddProject<Projects.Cloud5mins_ShortenerTools_TinyBlazorAdmin>("admin")
		.WithExternalHttpEndpoints()
		.WithReference(manAPI);

// 👇👇👇 new code for MCP Server
builder.AddProject<Projects.Cloud5mins_ShortenerTools_MCPServer>("mcp")
		.WithReference(manAPI)
		.WithExternalHttpEndpoints();

Notice how I added the MCP server with a reference to the manAPI - this is crucial as it needs access to the URL management API.

Configuring the MCP Server

To complete the setup, I needed to configure the dependency injection in the program.cs file of the MCPServer project. The key part was specifying the BaseAddress of the httpClient:

var builder = WebApplication.CreateBuilder(args);       
builder.Logging.AddConsole(consoleLogOptions =>
{
    // Configure all logs to go to stderr
    consoleLogOptions.LogToStandardErrorThreshold = LogLevel.Trace;
});
builder.Services.AddMcpServer()
    .WithTools<UrlShortenerTool>();

builder.AddServiceDefaults();

builder.Services.AddHttpClient<UrlManagerClient>(client => 
            {
                client.BaseAddress = new Uri("https+http://api");
            });
            
var app = builder.Build();

app.MapMcp();

app.Run();

That's all that was needed! Thanks to .NET Aspire, integrating the MCP server was straightforward. When you run the solution, the MCP server starts alongside the other projects and is available at http://localhost:{some port}/sse. The /sse part of the endpoint stands for Server-Sent Events and is critical: it's the URL that AI assistants will use to discover available tools.

Implementing the MCP Server Tools

Looking at the code above, two key lines make everything work:

  1. builder.Services.AddMcpServer().WithTools<UrlShortenerTool>(); - registers the MCP server and specifies which tools will be available
  2. app.MapMcp(); - maps the MCP server to the ASP.NET Core pipeline

Defining Tools with Attributes

The UrlShortenerTool class contains all the methods that will be exposed to AI assistants. Let's examine the ListUrl method:

[McpServerTool, Description("Provide a list of all short URLs.")]
public List<ShortUrlEntity> ListUrl()
{
	var urlList = _urlManager.GetUrls().Result.ToList<ShortUrlEntity>();
	return urlList;
}

The [McpServerTool] attribute marks this method as a tool the AI can use. I prefer keeping tool definitions simple, delegating the actual implementation to the UrlManager class that's injected in the constructor: UrlShortenerTool(UrlManagerClient urlManager).
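The same attribute-based approach extends to parameters: a Description on each argument tells the model what to pass. Here is a hypothetical sketch; this exact method and the underlying client call aren't in the repo:

[McpServerTool, Description("Create a new short URL, optionally with a vanity name.")]
public async Task<ShortUrlEntity> CreateUrl(
	[Description("The destination URL to shorten.")] string url,
	[Description("Optional vanity for the short URL.")] string? vanity = null)
{
	// Delegates to the injected UrlManagerClient, like the other tools
	return await _urlManager.CreateUrl(url, vanity);
}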

The URL Manager Client

The UrlManagerClient follows standard HttpClient patterns. It receives the pre-configured httpClient in its constructor and uses it to communicate with the API:

public class UrlManagerClient(HttpClient httpClient)
{
	public async Task<IQueryable<ShortUrlEntity>?> GetUrls()
    {
		IQueryable<ShortUrlEntity> urlList = null;
		try{
			using var response = await httpClient.GetAsync("/api/UrlList");
			if(response.IsSuccessStatusCode){
				var urls = await response.Content.ReadFromJsonAsync<ListResponse>();
				urlList = urls!.UrlList.AsQueryable<ShortUrlEntity>();
			}
		}
		catch(Exception ex){
			Console.WriteLine(ex.Message);
		}
        
		return urlList;
    }

	// other methods to manage short URLs
}

This separation of concerns keeps the code clean - tools handle the MCP interface, while the client handles the API communication.

Using the MCP Server with GitHub Copilot Chat

Now for the exciting part - connecting your MCP server to GitHub Copilot Chat! This is where you'll see your custom tools in action.

Configuring Copilot to Use Your MCP Server

Once the server is running (either deployed in Azure or locally), follow these steps:

  1. Open GitHub Copilot Chat in VS Code
  2. Change the mode to Agent by clicking the dropdown in the chat panel
  3. Click the Select Tools... button, then Add More Tools
Set GitHub Copilot mode to Agent

Selecting the Connection Type

GitHub Copilot supports several ways to connect to MCP servers:

All MCP Server types

There are multiple options available - you could have your server in a container or run it via command line. For our scenario, we'll use HTTP.

Note: At the time of writing this post, I needed to use the HTTP URL of the MCP server rather than HTTPS. You can get this URL from the Aspire dashboard by clicking on the resource and checking the available Endpoints.

After selecting your connection type, Copilot will display the configuration file, which you can modify anytime.

GitHub Copilot Chat Configuration

Interacting with Your Custom Tools

Now comes the fun part! You can interact with your MCP server in two ways:

  1. Natural language queries: Ask questions like "How many short URLs do I have?"
  2. Direct tool references: Use the pound sign to call specific tools: "With #azShortURL list all URLs"

The azShortURL is the name we gave to our MCP server in the configuration.
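For reference, the resulting entry in the configuration file looks roughly like this. The exact layout may vary with your VS Code version, and the port is just an example taken from the Aspire dashboard:

{
  "servers": {
    "azShortURL": {
      "type": "sse",
      "url": "http://localhost:5000/sse"
    }
  }
}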

GitHub Copilot question and response example


Key Learnings and Future Directions

Building this MCP server for AzUrlShortener taught me several valuable lessons:

What Worked Well

  • Integration with .NET Aspire was remarkably straightforward
  • The attribute-based approach to defining tools is clean and intuitive
  • The separation of tool definitions from implementation logic keeps the code maintainable

Challenges and Considerations

  • The csharp-SDK is only a few weeks old and still in preview
  • OAuth authentication isn't defined yet (though it's being actively worked on)
  • Documentation is present but evolving rapidly as the technology matures, so some features may not be fully documented yet

For the AzUrlShortener project specifically, I'm keeping this MCP server implementation in the experimental branch mcp-server until I can properly secure it. However, I'm already envisioning numerous other scenarios where MCP servers could add great value.

If you're interested in exploring this technology, I encourage you to:

  • Check out the GitHub repo
  • Fork it and create your own MCP server
  • Experiment with different tools and capabilities

Join the Community

If you have questions or want to share your experiences with others, I invite you to join the Azure AI Community Discord server:

Join Azure AI Community Discord

The MCP ecosystem is growing rapidly, and it's an exciting time to be part of this community!


~Frank


Reading Notes #644

This post gathers my recent reading notes on artificial intelligence, programming, and a few inspiring podcasts. It includes links to articles, tutorials, and fascinating discussions. Whether you're interested in the latest AI developments, .NET tools, or modern architectures, there's plenty here to spark your curiosity. 


Happy reading!

AI


Programming


Podcasts

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank