
I Co-Wrote 88 Unit Tests Using AI: A Developer's Journey

Testing has always been one of those tasks that developers know is essential but often find tedious. When I decided to add comprehensive unit tests to my NoteBookmark project, I thought: why not make this an experiment in AI-assisted development? What followed was a fascinating 4-hour journey that resulted in 88 unit tests, a complete CI/CD pipeline, and some valuable insights about working with AI coding assistants.

(French version here)

The Project: NoteBookmark

NoteBookmark is a .NET application built with C# that helps users manage and organize their reading notes and bookmarks. The project includes an API, a Blazor frontend, and uses Azure services for storage. You can check out the complete project on GitHub.

The Challenge: Starting from Zero

I'll be honest - it had been a while since I'd written comprehensive unit tests. Rather than diving in myself, I decided to see how different AI models would approach this task. My initial request was deliberately vague: "add a test project" without any other specifications.

Looking back, I realize I should have been more specific about which parts of the code I wanted covered. This would have made the review process easier and given me better control over the scope. But sometimes, the best learning comes from letting the AI surprise you.

The Great AI Model Comparison



GPT-4.1: Competent but Quiet

GPT-4.1 delivered decent results, but the experience felt somewhat mechanical. The code it generated was functional, but I found myself wanting more context. The explanations were minimal, and I often had to ask follow-up questions to understand the reasoning behind certain test approaches.

Gemini: The False Start

My experience with Gemini was... strange. Perhaps it was a glitch or an off day, but most of what was generated simply didn't work. I didn't persist with this model for long, as debugging AI-generated code that fundamentally doesn't function defeats the purpose of the exercise. Note that at the time of this writing, Gemini was still in preview, so I expect it to improve over time.

Claude Sonnet: The Clear Winner

This is where the magic happened. Claude Sonnet became my co-pilot of choice for this project. What set it apart wasn't just the quality of the code (though that was excellent), but the quality of the conversation. It felt like having a thoughtful colleague thinking out loud with me.

The explanations were clear and educational. When Claude suggested a particular testing approach, it would explain why. When it encountered a complex scenario, it would walk through its reasoning. I tried different versions of Claude Sonnet but didn't notice significant differences in results - they were all consistently good.

The Development Process: A 4-Hour Journey


Hour 1-2: Getting to Compilation

The first iteration couldn't compile. This wasn't surprising given the complexity of the codebase and the vague initial request. But here's where the AI collaboration really shined. Instead of manually debugging everything myself, I worked with Copilot to identify and fix issues iteratively.

We went through several rounds of:

  1. Identify compilation errors
  2. Discuss the best approach to fix them
  3. Let the AI implement the fixes
  4. Review and refine

After about 2 hours, we had a test project with 88 unit tests that compiled successfully. The AI had chosen xUnit as the testing framework, which I was happy with: it's a solid choice that I might not have picked on my own, given how rusty I was on the current .NET testing landscape.

Hour 2.5-3.5: Making Tests Pass

Getting the tests to compile was one thing; getting them to pass was another challenge entirely. This phase taught me a lot about both my codebase and xUnit features I wasn't familiar with.

I relied heavily on the /explain feature during this phase. When tests failed, I'd ask Claude to explain what was happening and why. This was invaluable for understanding not just the immediate fix, but the underlying testing concepts.

One of those moments was learning about [InlineData(true)] and other xUnit data attributes. These weren't features I was familiar with, and having them explained in context made them immediately useful.


InlineData in the code

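For anyone who, like me, hadn't used these attributes before, here's a small self-contained sketch of how [Theory] and [InlineData] work in xUnit; the helper being tested is made up for the illustration and isn't from NoteBookmark.

using Xunit;

public class InlineDataExamples
{
    // A tiny helper under test, defined here so the example stands on its own.
    private static bool IsEven(int value) => value % 2 == 0;

    // [Theory] + [InlineData] runs the same test once per data row.
    [Theory]
    [InlineData(2, true)]
    [InlineData(3, false)]
    [InlineData(0, true)]
    public void IsEven_ReturnsExpectedResult(int value, bool expected)
    {
        Assert.Equal(expected, IsEven(value));
    }

    // The [InlineData(true)] form mentioned above: a single boolean argument
    // drives two variations of the same scenario.
    [Theory]
    [InlineData(true)]
    [InlineData(false)]
    public void IsEven_MatchesFlagForBothParities(bool evenNumbers)
    {
        var values = evenNumbers ? new[] { 2, 4, 6 } : new[] { 1, 3, 5 };
        Assert.All(values, v => Assert.Equal(evenNumbers, IsEven(v)));
    }
}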

Hour 3.5-4: Structure and Style

Once all tests were passing, I spent time ensuring I understood each test and requesting structural changes to match my preferences. This phase was crucial for taking ownership of the code. Just because AI wrote it doesn't mean it should remain a black box. Let's repeat this: Understanding the code is essential; just because AI wrote it doesn't mean it's good.

Beyond Testing: CI/CD Integration

With the tests complete, I asked Copilot to create a GitHub Actions workflow to run the tests on every push to the main and v-next branches, plus on pull requests. Initially it started modifying my existing workflow that takes care of the Azure deployment. I wanted a separate workflow for testing, so I interrupted it (it's nice that I wasn't "forced" to wait) and asked it to create a new one instead. The result was the running-unit-tests.yml workflow, which worked perfectly on the first try.

This was genuinely surprising. CI/CD configurations often require tweaking, but the generated workflow handled all of the following (a simplified sketch of such a workflow appears after the list):

  • Multi-version .NET setup
  • Dependency restoration
  • Building and testing
  • Test result reporting
  • Code coverage analysis
  • Artifact uploading
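To give a sense of what that involves, here is a minimal sketch of a test workflow, assuming a single .NET version; it illustrates the general shape only and is not the actual running-unit-tests.yml from the repository.

name: Run Unit Tests

on:
  push:
    branches: [ main, v-next ]
  pull_request:
    branches: [ main ]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      # Get the source and install the .NET SDK (the version here is an assumption)
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '8.0.x'

      # Restore, build, and run the tests while collecting code coverage
      - run: dotnet restore
      - run: dotnet build --no-restore
      - run: dotnet test --no-build --logger trx --collect:"XPlat Code Coverage"

      # Keep the results around so they can be reported later
      - uses: actions/upload-artifact@v4
        if: always()
        with:
          name: test-results
          path: '**/TestResults/**'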

Code coverage


The PR Enhancement Adventure

Here's where things got interesting. When I asked Copilot to enhance the workflow to show test results in PRs, it started adding components, then paused and asked if it could delete the current version and start from scratch.

I said yes, and I'm glad I did. The rebuilt version created beautiful PR comments showing:

  • Test results summary
  • Code coverage reports (which I didn't ask for but appreciated)
  • Detailed breakdowns

PR display


The Finishing Touches

No project is complete without proper status indicators. I added a test status badge to the README, giving anyone visiting the repository immediate visibility into the project's health.

test status badge


Key Takeaways


What Worked Well

  1. AI as a Learning Partner: Having Copilot explain testing concepts and xUnit features was like having a patient teacher
  2. Iterative Refinement: The back-and-forth process felt natural and productive
  3. Comprehensive Solutions: The AI didn't just write tests; it created a complete testing infrastructure
  4. Quality Over Speed: While it took 4 hours, the result was thorough and well-structured

What I'd Do Differently

  1. Be More Specific Initially: Starting with clearer scope would have streamlined the process
  2. Set Testing Priorities: Identifying critical paths first would have been valuable
  3. Plan for Visual Test Reports: Thinking about test result visualization from the start

Lessons About AI Collaboration

  • Model Choice Matters: The difference between AI models was significant
  • Conversation Quality Matters: Clear explanations make the collaboration more valuable
  • Trust but Verify: Understanding every piece of generated code is crucial
  • Embrace Iteration: The best results come from multiple refinement cycles

The Bigger Picture

This experiment reinforced my belief that AI coding assistants are most powerful when they're true collaborators rather than code generators. The value wasn't just in the 88 tests that were written, but in the learning that happened along the way.

For developers hesitant about AI assistance in testing: this isn't about replacing your testing skills, it's about augmenting them. The AI handles the boilerplate and suggests patterns, but you bring the domain knowledge and quality judgment.

Conclusion

Would I do this again? Absolutely. The combination of comprehensive test coverage, learning opportunities, and time efficiency made this a clear win. The 4 hours invested created not just tests, but a complete testing infrastructure that will pay dividends throughout the project's lifecycle.

If you're considering AI-assisted testing for your own projects, my advice is simple: start the conversation, be prepared to iterate, and don't be afraid to ask "why" at every step. The goal isn't just working code - it's understanding and owning that code.

The complete test suite and CI/CD pipeline are available in the NoteBookmark repository if you want to see the results of this AI collaboration in action.


Reading Notes #646

Welcome to this week's collection of fascinating reads across cloud computing, AI, and programming! As technology continues to evolve at breakneck speed, I've gathered some of the most insightful articles that caught my attention. From securing MCP servers to exploring Rust, there's something here for every tech enthusiast. 
Dive in and discover what's new in our rapidly changing digital landscape.

Cloud

AI

Programming

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank

Making AI smarter with an MCP server that manages short URLs

Have you ever wanted to give your AI assistants access to your own custom tools and data? That's exactly what Model Context Protocol (MCP) allows us to do, and I've been experimenting with it lately.

(French version here)

I read a lot recently about Model Context Protocol (MCP) and how it is changing the way AI interacts with external systems. I was curious to see how it works and how I can use it in my own projects. There are many tutorials available online, but one of my favorites was written by James Montemagno: Build a Model Context Protocol (MCP) server in C#. This post isn't a tutorial, but rather a summary of my experience and what I learned along the way while building a real MCP server that manages short URLs.

MCP doesn't change the AI model itself; it's a protocol that helps your AI model interact with external resources: APIs, databases, etc. The protocol simplifies the way AI can access an external system, and it allows the AI to discover the tools those resources make available. Recently I was working on a project that manages short URLs, and I thought it would be a great opportunity to build an MCP server around it. I wanted to see how easy it is to build, and then to use it in VS Code with GitHub Copilot Chat.

Code: All the code of this post is available in the branch exp/mcp-server of the AzUrlShortener repo on GitHub.

Setting Up: Adding an MCP Server to a .NET Aspire Solution

The AzUrlShortener is a web solution that uses .NET Aspire, so the first thing I did was create a new project using the command:

dotnet new web -n Cloud5mins.ShortenerTools.MCPServer -o ./mcpserver

Required Dependencies

To transform this into an MCP server, I added these essential NuGet packages:

  • Microsoft.Extensions.Hosting
  • ModelContextProtocol.AspNetCore

Since this project is part of a .NET Aspire solution, I also added references to:

  • The ServiceDefaults project (for consistent service configuration)
  • The ShortenerTools.Core project (where the business logic lives)

Integrating with Aspire

Next, I needed to integrate the MCP server into the AppHost project, which defines all services in our solution. Here's how I added it to the existing services:

var manAPI = builder.AddProject<Projects.Cloud5mins_ShortenerTools_Api>("api")
						.WithReference(strTables)
						.WaitFor(strTables)
						.WithEnvironment("CustomDomain",customDomain)
						.WithEnvironment("DefaultRedirectUrl",defaultRedirectUrl);

builder.AddProject<Projects.Cloud5mins_ShortenerTools_TinyBlazorAdmin>("admin")
		.WithExternalHttpEndpoints()
		.WithReference(manAPI);

// 👇👇👇 new code for MCP Server
builder.AddProject<Projects.Cloud5mins_ShortenerTools_MCPServer>("mcp")
		.WithReference(manAPI)
		.WithExternalHttpEndpoints();

Notice how I added the MCP server with a reference to the manAPI - this is crucial as it needs access to the URL management API.

Configuring the MCP Server

To complete the setup, I needed to configure the dependency injection in the Program.cs file of the MCPServer project. The key part was specifying the BaseAddress of the HttpClient:

var builder = WebApplication.CreateBuilder(args);       
builder.Logging.AddConsole(consoleLogOptions =>
{
    // Configure all logs to go to stderr
    consoleLogOptions.LogToStandardErrorThreshold = LogLevel.Trace;
});
builder.Services.AddMcpServer()
    .WithTools<UrlShortenerTool>();

builder.AddServiceDefaults();

builder.Services.AddHttpClient<UrlManagerClient>(client => 
            {
                client.BaseAddress = new Uri("https+http://api");
            });
            
var app = builder.Build();

app.MapMcp();

app.Run();

That's all that was needed! Thanks to .NET Aspire, integrating the MCP server was straightforward. When you run the solution, the MCP server starts alongside the other projects and is available at http://localhost:{some port}/sse. The /sse part of the endpoint stands for Server-Sent Events and is critical - it's the URL that AI assistants will use to discover the available tools.

Implementing the MCP Server Tools

Looking at the code above, two key lines make everything work:

  1. builder.Services.AddMcpServer().WithTools<UrlShortenerTool>(); - registers the MCP server and specifies which tools will be available
  2. app.MapMcp(); - maps the MCP server to the ASP.NET Core pipeline

Defining Tools with Attributes

The UrlShortenerTool class contains all the methods that will be exposed to AI assistants. Let's examine the ListUrl method:

[McpServerTool, Description("Provide a list of all short URLs.")]
public List<ShortUrlEntity> ListUrl()
{
	var urlList = _urlManager.GetUrls().Result.ToList<ShortUrlEntity>();
	return urlList;
}

The [McpServerTool] attribute marks this method as a tool the AI can use. I prefer keeping tool definitions simple, delegating the actual implementation to the UrlManager class that's injected in the constructor: UrlShortenerTool(UrlManagerClient urlManager).
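To illustrate how a tool that takes arguments could look, here is a hedged sketch of another method that could live in the same UrlShortenerTool class; ListUrlTop and its maxCount parameter are made up for this example and are not part of the actual project.

[McpServerTool, Description("Provide the first short URLs, up to a maximum number of entries.")]
public List<ShortUrlEntity> ListUrlTop(
	[Description("Maximum number of short URLs to return.")] int maxCount)
{
	// The Description attributes are what the AI assistant reads to decide
	// which tool to call and what arguments to pass to it.
	var urlList = _urlManager.GetUrls().Result;
	return urlList?.Take(maxCount).ToList<ShortUrlEntity>() ?? new List<ShortUrlEntity>();
}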

The URL Manager Client

The UrlManagerClient follows standard HttpClient patterns. It receives the pre-configured httpClient in its constructor and uses it to communicate with the API:

public class UrlManagerClient(HttpClient httpClient)
{
	public async Task<IQueryable<ShortUrlEntity>?> GetUrls()
    {
		IQueryable<ShortUrlEntity> urlList = null;
		try{
			using var response = await httpClient.GetAsync("/api/UrlList");
			if(response.IsSuccessStatusCode){
				var urls = await response.Content.ReadFromJsonAsync<ListResponse>();
				urlList = urls!.UrlList.AsQueryable<ShortUrlEntity>();
			}
		}
		catch(Exception ex){
			Console.WriteLine(ex.Message);
		}
        
		return urlList;
    }

	// other methods to manage short URLs
}

This separation of concerns keeps the code clean - tools handle the MCP interface, while the client handles the API communication.

Using the MCP Server with GitHub Copilot Chat

Now for the exciting part - connecting your MCP server to GitHub Copilot Chat! This is where you'll see your custom tools in action.

Configuring Copilot to Use Your MCP Server

Once the server is running (either deployed in Azure or locally), follow these steps:

  1. Open GitHub Copilot Chat in VS Code
  2. Change the mode to Agent by clicking the dropdown in the chat panel
  3. Click the Select Tools... button, then Add More Tools
Set GitHub Copilot mode to Agent

Selecting the Connection Type

GitHub Copilot supports several ways to connect to MCP servers:

All MCP Server types

There are multiple options available - you could have your server in a container or run it via command line. For our scenario, we'll use HTTP.

Note: At the time of writing this post, I needed to use the HTTP URL of the MCP server rather than HTTPS. You can get this URL from the Aspire dashboard by clicking on the resource and checking the available Endpoints.

After selecting your connection type, Copilot will display the configuration file, which you can modify anytime.

GitHub Copilot Chat Configuration
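For reference, here is a minimal sketch of what that configuration could look like for this scenario; the server name matches the one used below, but the port is just an example.

{
  "servers": {
    "azShortURL": {
      "type": "sse",
      "url": "http://localhost:5000/sse"
    }
  }
}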

Interacting with Your Custom Tools

Now comes the fun part! You can interact with your MCP server in two ways:

  1. Natural language queries: Ask questions like "How many short URLs do I have?"
  2. Direct tool references: Use the pound sign to call specific tools: "With #azShortURL list all URLs"

azShortURL is the name we gave our MCP server in the configuration.

GitHub Copilot question and response example


Key Learnings and Future Directions

Building this MCP server for AzUrlShortener taught me several valuable lessons:

What Worked Well

  • Integration with .NET Aspire was remarkably straightforward
  • The attribute-based approach to defining tools is clean and intuitive
  • The separation of tool definitions from implementation logic keeps the code maintainable

Challenges and Considerations

  • The csharp-SDK is only a few weeks old and still in preview
  • OAuth authentication isn't defined yet (though it's being actively worked on)
  • Documentation is present but evolving rapidly as the technology matures, so some features may not be fully documented yet

For the AzUrlShortener project specifically, I'm keeping this MCP server implementation in the experimental branch exp/mcp-server until I can properly secure it. However, I'm already envisioning numerous other scenarios where MCP servers could add great value.

If you're interested in exploring this technology, I encourage you to:

  • Check out the GitHub repo
  • Fork it and create your own MCP server
  • Experiment with different tools and capabilities

Join the Community

If you have questions or want to share your experiences with others, I invite you to join the Azure AI Community Discord server:

Join Azure AI Community Discord

The MCP ecosystem is growing rapidly, and it's an exciting time to be part of this community!


~Frank


Reading Notes #642

This week, I explored posts about improving cache management for ASP.NET Core applications and understanding error handling in Blazor. These articles, along with others on AI model selection and development productivity, offer valuable insights for developers.


Cloud


Programming


AI


Miscellaneous



Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 


If you have interesting content, share it!

~Frank

Reading Notes #638

Welcome to Reading Notes, a curated dive into the latest and greatest in programming, cloud, and AI. From mastering multithreading with Azure to exploring GitHub Copilot's productivity potential, this collection is brimming with knowledge. Let's unravel what's new, innovative, and worth your attention!


Cloud


Programming

AI

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it! 

~frank


Reading Notes #627


This week, I stumbled upon some fascinating reads. From the announcement of .NET 9 and its incredible versatility to an intriguing new type of failover for Azure Storage, there's plenty to explore. Discover how to get .NET 9 running on your Raspberry Pi, check out the latest Blazorise update, and delve into the power of GitHub Models in .NET with Semantic Kernel. Plus, don't miss out on the introduction of GitHub Copilot for Azure and a new season of AI-related sessions in Visual Studio. And for my fellow open-source enthusiasts, the .NET Aspire Community Toolkit is a game-changer. 

Dive in and let's geek out together! 🌟

Suggestion of the week

  • Announcing .NET 9 - .NET Blog (.NET Team) - You can build anything with C# (aka .NET) and I love it! It runs everywhere, it's open source, it's fast, and it's free!

Cloud

Programming

  • Install and use Microsoft Dot NET 9 with the Raspberry Pi (Pete Codes) - C# everywhere! I love it! I do have some code that runs on a Pi as a mini server, but I need to have a look for an IoT library that could be used.

  • Blazorise v1.7 (Mladen Macanović) - New version of a nice-looking CSS framework for our Blazor websites, with more features and better performance.

AI

Data

Open Source

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.
If you have interesting content, share it!


~ Frank


Reading Notes #625

Welcome to another edition of Reading Notes! This week, dive into the latest updates on Azure DevOps, Docker best practices, System.Text.Json enhancements in .NET 9, AI innovations from GitHub Universe, and more. 

Enjoy your reading!

Cloud

Programming

AI

  • GitHub Spark (Devon Rifkin, Terkel Gjervig Nielsen, Cole Bemis, Alice Li) - Fascinating news from GitHub Universe. A new spin on the low-code app, but with code. Looking forward to trying it and seeing what I can build with it.

  • GitHub Copilot in Windows Terminal (Christopher Nguyen) - There it is, Copilot making its entrance into our beloved Terminal. It's only in the Canary version for now, but I'm sure it will help many of us when we're not sure which command to use, or what the bash/PowerShell equivalent is.

Miscellaneous


Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you have interesting content, share it!


~ Frank


Reading Notes #624

Dive into this week's fascinating mix of tech insights, troubleshooting tales, and productivity tips. From the latest in Azure Dev tools to real-world debugging adventures and cutting-edge .NET innovations, there's something for everyone.

Happy reading!

Cloud

Programming

Podcast

  • Microsoft Playwright Testing with Debbie O'Brien (.NET Rocks!) - Great tool to help with testing our websites. It's open source and now supports .NET.

  • Inspektor Gadget (DevOps and Docker Talk: Cloud Native Interviews and Tooling) - The first time I heard about Inspektor Gadget was in an episode of Open at Microsoft. I don't use Kubernetes much, but it will be part of my toolbox when I do. Great security, troubleshooting, and observability utility.

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you have interesting content, share it!


~ Frank


Reading Notes #622

Welcome to this week’s reading notes! In this post, you’ll find a curated selection of insightful articles and tutorials covering various topics in technology and programming. Whether you’re looking to enhance your testing skills with .NET Aspire, improve your code comprehension with GitHub Copilot, or explore the world of Docker for DevOps, there’s something here for everyone. Dive in and enjoy these valuable resources!

If you have interesting content, share it!

Suggestion of the week

Cloud

Programming

LowCode

Miscellaneous

  • Hosting a (DevOpsDays) Tech Conference (Dewan Ahmed) - I went to this event and you could feel it was prepared with passion and care. It's very interesting to learn about the behind-the-scenes work and all the effort put in both before and after. DevOpsDays Halifax, you won my heart.

~ Frank



Reading Notes #620

It's Reading Notes time! This week we learn how to improve our experience and security while using AI and containers.
A zip line spider?

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

Cloud

AI

Programming

~Frank

Reading Notes #590

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

Miscellaneous

  • Introducing Sudo for Windows! (Jordi Adoumie) - Wow! This is a really good new feature. How many times have I forgotten to start my terminal as admin and needed to start over again... Looking forward to trying it.
~Frank


Reading Notes #587

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

 

If you think you may have interesting content, share it!

Cloud

Low Code

Programming

Miscellaneous

~Frank

Reading Notes #582

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

 If you think you may have interesting content, share it!
Fridge with wings, realistic rendering 


 

Cloud

Data


Programming


~Frank

Reading Notes #581

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Suggestion of the week

Programming

Databases

Miscellaneous


~Frank


Reading Notes #570

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 


If you think you may have interesting content, share it!

 

Programming

Open Source

Low Code

Miscellaneous


~Enjoy!

Reading Notes #568


It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

 If you think you may have interesting content, share it!

Programming






Data


Low Code


Open Source


~Frank

Reading Notes #562


It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you think you may have interesting content, share it!

Cloud

Programming

Open Source

  • How to write a perfect README for your GitHub project (Marc Seitz) - This is a nice and simple guide to make sure your readme helps visitors land well on your project.

  • 5 Ways OpenTelemetry Can Reduce Costs (Morgan McLean) - Great tools from the open-source community to help save money.

  • Open at Microsoft – OmniBOR (Aeva Black) - Did you know there is a tool that can help you see the security flaws of your dependencies? Learn more about this project in this post and video.

  • Dapr (AugustaUd) - Learn more about Dapr with this series of three videos, each containing short demos. With a very active community, there is no doubt this OSS project is healthy.

Low Code

Miscellaneous

~Frank

Reading Notes #561

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

The suggestion of the week

Programming

Low Code

Miscellaneous

~Frank