
Making AI smarter with an MCP server that manages short URLs

Have you ever wanted to give your AI assistants access to your own custom tools and data? That's exactly what Model Context Protocol (MCP) allows us to do, and I've been experimenting with it lately.

(French version here)

I have read a lot recently about the Model Context Protocol (MCP) and how it is changing the way AI interacts with external systems. I was curious to see how it works and how I could use it in my own projects. There are many tutorials available online, but one of my favorites was written by James Montemagno: Build a Model Context Protocol (MCP) server in C#. This post isn't a tutorial, but rather a summary of my experience and what I learned along the way while building a real MCP server that manages short URLs.

MCP doesn't change the AI itself; it's a protocol that helps your AI model interact with external resources: APIs, databases, etc. The protocol simplifies the way AI can access an external system, and it allows the AI to discover the available tools from those resources. Recently I was working on a project that manages short URLs, and I thought it would be a great opportunity to build an MCP server for it. I wanted to see how easy it is to build, and then use it in VS Code with GitHub Copilot Chat.

Code: All the code from this post is available in the branch exp/mcp-server of the AzUrlShortener repo on GitHub.

Setting Up: Adding an MCP Server to a .NET Aspire Solution

The AzUrlShortener is a web solution that uses .NET Aspire, so the first thing I did was create a new project using the command:

dotnet new web -n Cloud5mins.ShortenerTools.MCPServer -o ./mcpserver

Required Dependencies

To transform this into an MCP server, I added these essential NuGet packages:

  • Microsoft.Extensions.Hosting
  • ModelContextProtocol.AspNetCore

Since this project is part of a .NET Aspire solution, I also added references to:

  • The ServiceDefaults project (for consistent service configuration)
  • The ShortenerTools.Core project (where the business logic lives)

Integrating with Aspire

Next, I needed to integrate the MCP server into the AppHost project, which defines all services in our solution. Here's how I added it to the existing services:

var manAPI = builder.AddProject<Projects.Cloud5mins_ShortenerTools_Api>("api")
						.WithReference(strTables)
						.WaitFor(strTables)
						.WithEnvironment("CustomDomain",customDomain)
						.WithEnvironment("DefaultRedirectUrl",defaultRedirectUrl);

builder.AddProject<Projects.Cloud5mins_ShortenerTools_TinyBlazorAdmin>("admin")
		.WithExternalHttpEndpoints()
		.WithReference(manAPI);

// 👇👇👇 new code for MCP Server
builder.AddProject<Projects.Cloud5mins_ShortenerTools_MCPServer>("mcp")
		.WithReference(manAPI)
		.WithExternalHttpEndpoints();

Notice how I added the MCP server with a reference to the manAPI - this is crucial as it needs access to the URL management API.

Configuring the MCP Server

To complete the setup, I needed to configure the dependency injection in the Program.cs file of the MCPServer project. The key part was specifying the BaseAddress of the HttpClient:

var builder = WebApplication.CreateBuilder(args);       
builder.Logging.AddConsole(consoleLogOptions =>
{
    // Configure all logs to go to stderr
    consoleLogOptions.LogToStandardErrorThreshold = LogLevel.Trace;
});
builder.Services.AddMcpServer()
    .WithTools<UrlShortenerTool>();

builder.AddServiceDefaults();

builder.Services.AddHttpClient<UrlManagerClient>(client => 
            {
                client.BaseAddress = new Uri("https+http://api");
            });
            
var app = builder.Build();

app.MapMcp();

app.Run();

That's all that was needed! Thanks to .NET Aspire, integrating the MCP server was straightforward. When you run the solution, the MCP server starts alongside the other projects and is available at http://localhost:{some port}/sse. The /sse part of the endpoint stands for Server-Sent Events and is critical - it's the URL that AI assistants will use to discover the available tools.

Implementing the MCP Server Tools

Looking at the code above, two key lines make everything work:

  1. builder.Services.AddMcpServer().WithTools<UrlShortenerTool>(); - registers the MCP server and specifies which tools will be available
  2. app.MapMcp(); - maps the MCP server to the ASP.NET Core pipeline

Defining Tools with Attributes

The UrlShortenerTool class contains all the methods that will be exposed to AI assistants. Let's examine the ListUrl method:

[McpServerTool, Description("Provide a list of all short URLs.")]
public async Task<List<ShortUrlEntity>> ListUrl()
{
    // Await the call instead of blocking on .Result, which wastes a thread
    // and risks deadlocks; the result may be null if the API call failed
    var urls = await _urlManager.GetUrls();
    return urls?.ToList() ?? new List<ShortUrlEntity>();
}

The [McpServerTool] attribute marks this method as a tool the AI can use. I prefer keeping tool definitions simple, delegating the actual implementation to the UrlManagerClient class that's injected via the constructor: UrlShortenerTool(UrlManagerClient urlManager).
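Tools can also take parameters. Here is a hypothetical sketch of what a creation tool could look like - the CreateUrl method on the client is an assumption for illustration, not necessarily the repo's actual API - where the [Description] attributes on the parameters tell the AI what each argument means:

[McpServerTool, Description("Create a new short URL for a given destination URL.")]
public async Task<ShortUrlEntity?> CreateUrl(
    [Description("The destination URL to shorten.")] string url,
    [Description("An optional title for the short URL.")] string? title = null)
{
    // Hypothetical client method; the real UrlManagerClient exposes
    // whatever operations the management API supports.
    return await _urlManager.CreateUrl(url, title);
}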

The URL Manager Client

The UrlManagerClient follows standard HttpClient patterns. It receives the pre-configured httpClient in its constructor and uses it to communicate with the API:

public class UrlManagerClient(HttpClient httpClient)
{
    public async Task<IQueryable<ShortUrlEntity>?> GetUrls()
    {
        IQueryable<ShortUrlEntity>? urlList = null;
        try
        {
            // Call the API; the base address was configured in Program.cs
            using var response = await httpClient.GetAsync("/api/UrlList");
            if (response.IsSuccessStatusCode)
            {
                var urls = await response.Content.ReadFromJsonAsync<ListResponse>();
                urlList = urls!.UrlList.AsQueryable<ShortUrlEntity>();
            }
        }
        catch (Exception ex)
        {
            // Log and return null; callers treat null as "no data"
            Console.WriteLine(ex.Message);
        }

        return urlList;
    }

    // other methods to manage short URLs
}

This separation of concerns keeps the code clean - tools handle the MCP interface, while the client handles the API communication.

Using the MCP Server with GitHub Copilot Chat

Now for the exciting part - connecting your MCP server to GitHub Copilot Chat! This is where you'll see your custom tools in action.

Configuring Copilot to Use Your MCP Server

Once the server is running (either deployed in Azure or locally), follow these steps:

  1. Open GitHub Copilot Chat in VS Code
  2. Change the mode to Agent by clicking the dropdown in the chat panel
  3. Click the Select Tools... button, then Add More Tools

Set GitHub Copilot mode to Agent

Selecting the Connection Type

GitHub Copilot supports several ways to connect to MCP servers:

All MCP Server types

There are multiple options available - you could have your server in a container or run it via command line. For our scenario, we'll use HTTP.

Note: At the time of writing this post, I needed to use the HTTP URL of the MCP server rather than HTTPS. You can get this URL from the Aspire dashboard by clicking on the resource and checking the available Endpoints.

After selecting your connection type, Copilot will display the configuration file, which you can modify anytime.

GitHub Copilot Chat Configuration
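For reference, here is roughly what my configuration looked like. Treat it as a sketch, since the exact schema may have evolved since writing, and replace the port with the one Aspire assigned to the mcp resource:

{
  "servers": {
    "azShortURL": {
      "type": "sse",
      "url": "http://localhost:{some port}/sse"
    }
  }
}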

Interacting with Your Custom Tools

Now comes the fun part! You can interact with your MCP server in two ways:

  1. Natural language queries: Ask questions like "How many short URLs do I have?"
  2. Direct tool references: Use the pound sign to call specific tools: "With #azShortURL list all URLs"

azShortURL is the name we gave to our MCP server in the configuration file shown above.

GitHub Copilot question and response example


Key Learnings and Future Directions

Building this MCP server for AzUrlShortener taught me several valuable lessons:

What Worked Well

  • Integration with .NET Aspire was remarkably straightforward
  • The attribute-based approach to defining tools is clean and intuitive
  • The separation of tool definitions from implementation logic keeps the code maintainable

Challenges and Considerations

  • The MCP csharp-sdk is only a few weeks old and still in preview
  • OAuth authentication isn't defined yet (though it's being actively worked on)
  • Documentation exists but is evolving rapidly as the technology matures, so some features may not be fully documented yet

For the AzUrlShortener project specifically, I'm keeping this MCP server implementation in the experimental branch exp/mcp-server until I can properly secure it. However, I'm already envisioning numerous other scenarios where MCP servers could add great value.

If you're interested in exploring this technology, I encourage you to:

  • Check out the GitHub repo
  • Fork it and create your own MCP server
  • Experiment with different tools and capabilities

Join the Community

If you have questions or want to share your experiences with others, I invite you to join the Azure AI Community Discord server:

Join Azure AI Community Discord

The MCP ecosystem is growing rapidly, and it's an exciting time to be part of this community!


~Frank


Reading Notes #643

In this week's Reading Notes, we explore a diverse range of updates and insights from the tech world. From the latest features in the Azure SDK and Developer CLI to an introduction to .NET Aspire and its innovative approach to Infrastructure as Code, there's plenty to catch up on.

Jump into discussions on AI productivity, free Azure SQL tiers, and even a refreshing podcast on stress-free living. 


Let's get started!


Cloud

Databases

AI

Programming

Podcasts


Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 


If you have interesting content, share it!

~Frank

Reading Notes #636

Welcome back to another edition of my Reading Notes! This week, we've got some great content lined up, including insights on cloud development with Azure Developer CLI, tips for promoting your open source projects, updates on .NET, and more. Dive in to explore a wealth of knowledge and stay updated with the latest trends and developments.

Cloud

Programming

AI

Miscellaneous


Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

~Frank


Reading Notes #633

This edition of Reading Notes covers the synergies between C# and Rust, open-source software licensing challenges, and securing the software supply chain. It also features a beginner's guide to programming C# in Visual Studio Code. Finally, we delve into the evaluation process of AI models for GitHub Copilot.

Enjoy the read!

man looking at the camera with a frozen lake behind him
Programming

Miscellaneous

  • How we evaluate AI models and LLMs for GitHub Copilot (Connor Adams, Klint Finley) - This is an interesting post that shares the methods used to test the different models before they can be included. Testing a model is different from testing a regular piece of code: what are they looking for, and what's important?

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you have interesting content, share it!

~Frank

Visual Countdown Days Until [a date]

During the holidays, I embarked on a fun project to create a visual countdown for important dates. Inspired by howmanysleeps and hometime from veebch, I wanted to build a countdown that didn't rely on Google Calendar. Instead, I used a Raspberry Pi Pico and some custom code to achieve this.

💾 You can find the full code on GitHub


Raspberry Pi pico and the light using custom colors

What It Is

This project consists of two main parts:

  • Python code for the Raspberry Pi Pico
  • A .NET website to update the configuration, allowing you to set:
    • The important date
    • Two custom colors or random ones
    • The RGB values for the custom colors


screenshot of the configuration website

What You Need

How to Deploy the Configuration Website

After cloning the repo, navigate to the src/NextEvent/ folder and use the Azure Developer CLI to initialize the project:

azd init

Enter a meaningful name for your resource group in Azure. To deploy, use the deployment command:

azd up

Specify the Azure subscription and location when prompted. After a few minutes, everything should be deployed. You can access the URL from the output in the terminal or retrieve it from the Azure Portal.

How to Set Up the Raspberry Pi Pico

Edit the config.py file to add your Wi-Fi information and update the number of lights on your light strip.

You can use Thonny to copy the Python code to the device. Copy both main.py and config.py to the Raspberry Pi Pico.
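If you prefer the command line over Thonny, the mpremote tool can do the same copy. A minimal sketch, assuming mpremote is installed (pip install mpremote) and the Pico is the only device connected:

# copy both scripts to the Pico's filesystem
mpremote cp main.py :main.py
mpremote cp config.py :config.py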

How It Works

  • The website creates a JSON file and saves it in a publicly accessible Azure storage.
  • When the Pi is powered on, it will:
    • Turn all the lights of the strip green, one by one
    • Change the color of the entire light strip a few times, then turn it off
    • Try to connect to the Wi-Fi
    • Retrieve the timezone, current date, and settings from the JSON file
    • If the important date is within 24 days, the countdown will be displayed using random colors or the specified colors.
    • If the date has passed, the light strip will display a breathing effect with a random color of the day.

The Code on the Raspberry Pi Pico

The main code for the Raspberry Pi Pico is written in Python. Here's a brief overview of what it does:

  1. Connect to Wi-Fi: The connect_to_wifi function connects the Raspberry Pi Pico to the specified Wi-Fi network.
  2. Get Timezone and Local Time: The get_timezone and get_local_time functions fetch the current timezone and local time using online APIs.
  3. Fetch Light Settings: The get_light_settings function retrieves the important date and RGB colors from the JSON file stored in Azure.
  4. Calculate Sleeps Until Special Day: The sleeps_until_special_day function calculates the number of days until the important date.
  5. Control the LED Strip: The progress function controls the LED strip, displaying the countdown or a breathing effect based on the current date and settings.

The Configuration Website

The configuration website is built in C#. It's a Blazor Server web app, and I used .NET Aspire to make it easy to run locally. The UI uses FluentUI-Blazor, so it looks pretty without effort.

The website allows you to update the settings for the Raspberry Pi Pico. You can set the important date, choose custom colors, and save these settings to a JSON file in Azure storage.
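Under the hood, saving the settings boils down to serializing them to JSON and uploading a blob. Here is a minimal C# sketch of that idea; the LightSettings record and the container and blob names are assumptions for illustration, and the real project may differ:

using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

// Hypothetical settings shape, for illustration only
public record LightSettings(DateOnly ImportantDate, bool UseRandomColors,
                            int[] Color1, int[] Color2);

public class SettingsWriter(BlobServiceClient blobService)
{
    public async Task SaveAsync(LightSettings settings)
    {
        // The Pi reads this blob anonymously, so it lives in a
        // publicly accessible container (as described above)
        var container = blobService.GetBlobContainerClient("config");
        await container.CreateIfNotExistsAsync();

        var json = JsonSerializer.Serialize(settings);
        await container.GetBlobClient("nextevent.json")
                       .UploadAsync(BinaryData.FromString(json), overwrite: true);
    }
}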

Little Extra

The website is deployed in Azure Container Apps with minimum scaling set to zero to save on costs. This may cause a slight delay when loading the site for the first time, but it works just fine and returns to "dormant" mode after a while.

I hope you enjoyed reading about my holiday project! It was a fun and educational experience, and I look forward to working on more projects like this in the future.

What's Next?

Currently, the project does a 24-day countdown (inspired by the advent calendar). I would like to add a feature that lets the user set the number of days for the countdown. I would also like to add the possibility of setting the color of the breathing effect (or keeping it random) once the important date has passed. And lastly, I would like to add the time of day when the light strip should turn on and off, because we all have different schedules 😉.

Last thoughts

I really enjoyed doing this project. It was a fun way to learn more about the Raspberry Pi Pico, MicroPython (I didn't even know it was a thing), and FluentUI Blazor. I hope you enjoyed reading about it and that it inspires you to create your own fun projects. If you have any questions or suggestions, feel free to reach out; I'm fboucheros on most socials.

~Frank

Reading Notes #627


This week, I stumbled upon some fascinating reads. From the announcement of .NET 9 and its incredible versatility to an intriguing new type of failover for Azure Storage, there's plenty to explore. Discover how to get .NET 9 running on your Raspberry Pi, check out the latest Blazorise update, and delve into the power of GitHub Models in .NET with Semantic Kernel. Plus, don't miss out on the introduction of GitHub Copilot for Azure and a new season of AI-related sessions in Visual Studio. And for my fellow open-source enthusiasts, the .NET Aspire Community Toolkit is a game-changer. 

Dive in and let's geek out together! 🌟

Suggestion of the week

  • Announcing .NET 9 - .NET Blog (.NET Team) - You can build anything with C# (aka .NET) and I love it! It runs everywhere, it's open source, and it's fast and free!

Cloud

Programming

  • Install and use Microsoft Dot NET 9 with the Raspberry Pi (Pete Codes) - C# everywhere! I love it! I do have some code that runs on a Pi as a mini server, but I need to have a look for an IoT library that could be used.

  • Blazorise v1.7 (Mladen Macanović) - New version of a nice looking CSS Framework for our Blazor website with more features and better performance.

AI

Data

Open Source

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.
If you have interesting content, share it!


~ Frank


Reading Notes #626

Kick off your week with this curated list of must-read tech articles. From .NET modernisation patterns and new C# 13 LINQ methods to open-source contributions and thought-provoking reads, there's plenty to explore. 


Enjoy!

Programming

Open-Source

Miscellaneous

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you have interesting content, share it!


~ Frank

Reading Notes #625

Welcome to another edition of Reading Notes! This week, dive into the latest updates on Azure DevOps, Docker best practices, System.Text.Json enhancements in .NET 9, AI innovations from GitHub Universe, and more. 

Enjoy your reading!

Cloud

Programming

AI

  • GitHub Spark (Devon Rifkin, Terkel Gjervig Nielsen, Cole Bemis, Alice Li) - Fascinating news from GitHub Universe. A new spin on the low-code app, but with code. Looking forward to trying it and seeing what I can build with it.

  • GitHub Copilot in Windows Terminal (Christopher Nguyen) - There it is, Copilot making its entrance into our beloved Terminal. It's only in the Canary version for now, but I'm sure it will help many of us when we're not sure which command to use, or what its bash/PowerShell equivalent is.

Miscellaneous


Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you have interesting content, share it!


~ Frank


Reading Notes #623

Ready for another round of intriguing reads and insightful listens? This week's edition of Reading Notes dives into the seamless blending of Python and .NET, fresh monetisation strategies in open source, AI innovations, and thought-provoking podcast discussions. 

Get comfy and let's dive in!

path in a forest surround by large trees



Programming

AI

Podcast

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you have interesting content, share it!

~ Frank

Reading Notes #617

Fewer posts and articles this week; I was devouring books! I guess a special "book edition" of the Reading Notes should be coming soon...

If you are new on this blog, welcome! The reading notes post is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

 
Did you also read something you liked? Share it!

Cloud

Programming

Podcasts

  • Debug Containers with Mintoolkit (DevOps and Docker Talk: Cloud Native Interviews and Tooling) - Great episode about the intricacies of debugging containers and a tool, mintoolkit (formerly known as DockerSlim), that could really simplify things for us.

  • Fine tuning Products with Stanza System's Stacie Frederick (Hanselminutes with Scott Hanselman) - Do we really need automatic builds and deployments x times per week, or per day? How far should we go? Are we trying to solve a real problem, or just building a solution we think we need? A great episode that will help you answer those questions.

  • Better ways to have tricky conversations at work (Modern Mentor) - I've heard many stories of people who quit their job before even talking to their manager... This short episode provides a few tips to get more prepared and to try different alternatives.

~Frank

Reading Notes #608

It's reading notes time! It is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 


Did you also read something you liked? Share it!

Cloud

Programming

AI

Podcast

Miscellaneous


~ Frank


Reading Notes #602

It's reading notes time! It is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.


Have interesting content? Share it!

Cloud

Programming

AI

~Frank

Reading Notes #601

It's reading notes time! It is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

Have interesting content? Share it!


Suggestion of the week

  • Announcing: Azure Developers (Mehul Harry) - Looking forward to this event. I have the pleasure of presenting a session with Jerry Nixon about Data API Builder. Join us!

Cloud

  • Demystifying Azure CLI pagination (Jeremy Li) - That's great! It's so sad when all the information is "thrown" at us without any control, and it's on us to find the "needle" we are looking for in those screens full of lines. This will definitely help.

Programming

Open Source

~ Frank

Reading Notes #598

It's reading notes time! It is a habit I started a long time ago, close to 600 weeks ago in fact, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you think you may have interesting content, share it!

Cloud

Programming

DevOps

Open Source

AI


~Frank

Reading Notes #597

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Suggestion of the week

Cloud

Programming

Open Source

Miscellaneous

~Frank

Reading Notes #589

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

  • Understanding C# 8 default interface methods (Andrew Lock) - Very clear post about the new feature available in interfaces, with great examples that make us understand why and when it is useful and how to implement it.

Open Source

Podcasts

~Frank

Reading Notes #588

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Programming

~Frank

Database to go! The perfect database for developers

When building a new project, we don't need a big database that scales and has lots of data, but we still need some kind of data source. Of course, it is possible to fake it and have some hardcoded values returned by an API, but that takes time to create and it's not a database. In this post, I want to share a solution to have a portable, self-healing, disposable, disconnected database that doesn't require any installation.

The solution? Put the database in a container! It doesn't matter what database you are planning to use or on which OS you are developing. Most databases will have an official image available on Docker Hub and Docker runs on all platforms. If you feel uncomfortable with containers, have no fear, this post is beginner-friendly.

This post is part of a series where I share my experience while building a Dungeon crawler game. The code can be found on GitHub.


The Plan

Have a database ready at the "press of a button". By "ready", I mean up and running, with data in it, and accessible to all developer tools.

Preparation for the Database

We need a script to create the database schema and some data. There are many ways to achieve this. A beginner-friendly way is to create an empty database and use a tool like Azure Data Studio to help create the SQL scripts. Doing it this way will validate that the script works.

The Docker command to create the database's container will change a little depending on the database you are using, but here is what a MySQL one looks like:

docker run --name some-mysql -e MYSQL_ROOT_PASSWORD='rootPassword' -p 3306:3306 -d mysql 

Where some-mysql is the name you want to assign to your container, rootPassword is the password to be set for the MySQL root user, and -d means that the container will run detached. The -p option maps port 3306 of the container to port 3306 of the host; this is required to be able to connect to the database from the host.

output of the docker run command
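Before connecting any tools, you can sanity-check that the server accepts connections by opening the MySQL command-line client directly inside the container, using the container name from the command above (enter rootPassword when prompted):

# open a MySQL prompt inside the running container
docker exec -it some-mysql mysql -uroot -p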


Now a MySQL server is running inside the container. To connect to the server with Azure Data Studio, use the MySQL extension for Azure Data Studio. Microsoft has a quickstart, Use Azure Data Studio to connect and query MySQL, if needed. Create a new connection in Azure Data Studio, then create a database (ex: 2d6db).

Create a new connection in Azure Data Studio

You can use the MySQL command-line tool if you prefer, but Azure Data Studio offers a lot of help when you are not that familiar with SQL. You can even use the Copilot extension and ask it to write the SQL statement for you. It's pretty good!

If you want to learn more about this, check the Open at Microsoft episode: Copilot is now in Azure Data Studio and this is how it can help you! to see it in action.

It's fantastic for generating a first draft of the CREATE statements and for writing queries.

Copilot writing SQL

Let's create two SQL scripts. The first one will be to create the schema with all the tables. The idea here is to write the script and execute it to validate the results. Here is an example creating only one table to keep the post simple.

-- schema.sql

CREATE TABLE IF NOT EXISTS 2d6db.rooms (
  id int NOT NULL AUTO_INCREMENT,
  roll int DEFAULT 0,
  level int DEFAULT 1,
  size varchar(10) DEFAULT NULL,
  room_type varchar(255) DEFAULT NULL,
  description varchar(255) DEFAULT NULL,
  encounter varchar(255) DEFAULT NULL,
  exits varchar(255) DEFAULT NULL,
  is_unique bool DEFAULT false,
  PRIMARY KEY (id)
);

Now that there are tables in the database, let's fill them with seed data. For this, the second SQL script will contain insert statements to populate the tables. We don't need all the data, only what will be useful when developing. Think about creating data to cover all the types or scenarios; it's a development database, so it should contain data that helps you code.

-- data.sql

INSERT INTO 2d6db.rooms(roll, level, room_type, size, description, exits, is_unique)
VALUES (2,1,'Empty space', 'small','There is nothing in this small space', 'Archways',false);

INSERT INTO 2d6db.rooms(roll, level, room_type, size, description, exits, is_unique)
VALUES (3,1,'Strange Text', 'small','This narrow room connects the corridors and has no furniture. On the wall though...', 'Archways',false);

INSERT INTO 2d6db.rooms(roll, level, room_type, size, description, exits, is_unique)
VALUES (4,1,'Grakada Mural', 'small','There is a large mural of Grakada here. Her old faces smiles...', 'Archways',true);

Note: You can now stop the database's container with the command: docker stop some-mysql. We don't need it anymore.

Putting All the Pieces Together

This is when the magic starts to show up. Using a Docker Compose file, we will start the database container and execute the two SQL scripts to create and populate the database.

# docker-compose.yml

services:
  
  2d6server:
    image: mysql
    command: --default-authentication-plugin=mysql_native_password
    environment:
      MYSQL_DATABASE: '2d6db'
      MYSQL_ROOT_PASSWORD: rootPassword
    ports:
       - "3306:3306"
    volumes:
      - "../database/scripts/schema.sql:/docker-entrypoint-initdb.d/1.sql"
      - "../database/scripts/data.sql:/docker-entrypoint-initdb.d/2.sql"

Docker Compose files are written in YAML and are usually used to start multiple containers at once, but they don't have to. In this scenario, the file defines a single container named 2d6server, using the same MySQL image and configuration as the previous Docker command. The last section, volumes, is new. It maps the path where the SQL scripts are located to /docker-entrypoint-initdb.d inside the container. When MySQL starts, it executes the files in that specific folder in alphabetical order. This is why the scripts are renamed 1.sql and 2.sql: the tables must be created first.

To get the database up and ready, we execute docker compose up.


# start the database
docker compose -f /path_to_your_file/docker-compose.yml up -d 

# stop the database
docker compose -f /path_to_your_file/docker-compose.yml down

By default, the command looks for a docker-compose.yml file. If your file has a different name, use the -f argument to specify the filename. Optionally, to free the prompt, you can pass the -d argument to docker compose up to run in detached mode.

Docker Compose commands

When you are done coding and you don't need the database anymore, execute the docker compose down command to free up your computer. Compared to when the server is installed locally, a container will leave no trace; your computer is not "polluted".

When you need to update the database, edit the SQL scripts first. Once the scripts are ready, recreate the container to get the database refreshed; the initialization scripts only run when the container starts with an empty data directory, so a plain restart isn't enough. See the sketch below.
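In practice, the refresh cycle looks like this; note the -v flag on down, which also removes the container's anonymous data volume so the initialization scripts run again on a clean slate:

# refresh the database after editing the SQL scripts
docker compose -f /path_to_your_file/docker-compose.yml down -v
docker compose -f /path_to_your_file/docker-compose.yml up -d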

To Conclude

Now, you only need to execute one simple command to get a fresh database whenever you want. Developers no longer need to have a database server installed and configured locally. And you don't need to worry about deleting or modifying data, like when using a shared database. After cloning the repository, all developers will have everything they need to start coding.

In a future post, I will share how I used Azure Data API Builder to generate a complete API on top of the database, using the same Docker Compose method.

Video version!

If you prefer watching instead of reading, here is the video version of this post!

Reading Notes #577

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 


 If you think you may have interesting content, share it!

 

Cloud

Programming

Low Code

~Frank

Reading Notes #576

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.


If you think you may have interesting content, share it!


Suggestion of the week

  • Visual Studio Code: C# Dev Kit Now Generally Available (Almir Vuk) - As a .NET developer who spends a lot of time on Linux, I was already using VS Code a lot, and I started using the Dev Kit, curious to see if it would really help. I felt empowered. Great work, and congrats on the GA; that was fast!

Programming


Podcasts

  • What do we want from a web browser? (The Changelog: Software Development, Open Source) - I think it was my first episode of The Changelog. I liked it. Interesting discussion about what should be a "good" web browser...

~Frank