
Reading Notes #611

Welcome to this week’s edition of Reading Notes! In this roundup, we explore a variety of topics across cloud, programming, databases, and AI. From understanding Docker’s USER instruction to styling Blazor components with CSS, I’ve got you covered. Let’s dive in!


Suggestion of the week

  • Understanding the Docker USER Instruction (Jay Schmidt) - A great post that explains really clearly the basic usage of the USER instruction when building our containers. After reading this post, you should feel confident following these best practices.

Cloud

Programming

Databases

AI

~frank

Reading Notes #607

It's reading notes time! It is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

Did you also read something you liked? Share it!

Cloud

Programming

Databases

Podcasts


~frank


It turns out, it's not difficult to remove all passwords from our Docker Compose files

I used to hardcode my password in my demos and code samples. I know it's not a good practice, but it's just for demo purposes; it cannot be that bad, right? I know there are proper ways to manage sensitive information, but this is only temporary! And it must be complicated to remove all the passwords from a deployment... It turns out, IT IS NOT difficult at all, and doing it prevents serious threats.

In this post, I will share how to remove all the passwords from a docker-compose file using environment variables. It's quick to set up and easy to remember. For a production deployment, it's better to use secrets, because environment variables will be visible in logs; that said, for demos, debugging, and testing, it's nice to see those values. The code is available on GitHub. This deployment was used for my talks during Azure Developers .NET Days: Auto-Generate and Host Data API Builder on Azure Static Web Apps and The most minimal API code of all... none

The Before Picture

For this deployment, I used a docker-compose file to deploy a SQL Server instance in one container and Data API Builder (DAB) in a second one. When the database container starts, I run a script to create the database tables and populate them.

services:

  dab:
    image: "mcr.microsoft.com/azure-databases/data-api-builder:latest"
    container_name: trekapi
    restart: on-failure
    volumes:
      - "./startrek.json:/App/dab-config.json"
    ports:
      - "5000:5000"
    depends_on:
      - sqlDatabase

  sqlDatabase:
    image: mcr.microsoft.com/mssql/server
    container_name: trekdb
    hostname: sqltrek
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "1rootP@ssword"
    ports:
      - "1433:1433"
    volumes:
      - ./startrek.sql:/startrek.sql
    entrypoint:
      - /bin/bash
      - -c
      - |
        /opt/mssql/bin/sqlservr & sleep 30
        /opt/mssql-tools/bin/sqlcmd -U sa -P "1rootP@ssword" -d master -i /startrek.sql
        sleep infinity

As we can see, the password is in clear text twice: in the configuration of the database container and in the parameter for sqlcmd when populating the database. The same goes for the DAB configuration file. Here is the data-source node, where the password is in clear text in the connection string.

"data-source": {
  "database-type": "mssql",
  "connection-string": "Server=localhost;Database=trek;User ID=sa;Password=myPassword!;",
  "options": {
    "set-session-context": false
  }
}

First Pass: Environment Variables

The easiest password instance to remove was in the sqlcmd command. When defining the container, an environment variable was already set... why not reuse it! To reference an environment variable inside an entrypoint script in a docker-compose file, you use the syntax $$VAR_NAME: the double dollar sign keeps compose from interpolating the value itself and passes $VAR_NAME through to the container's shell. I used the name of the existing environment variable, MSSQL_SA_PASSWORD, to replace the hardcoded password.

/opt/mssql-tools/bin/sqlcmd -U sa -P $$MSSQL_SA_PASSWORD -d master -i /startrek.sql
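The double dollar sign matters: compose would interpolate a single $VAR itself on the host, while $$VAR passes a literal $VAR through to the shell inside the container. A quick sketch, in plain shell with the sample password inlined, of what the entrypoint ends up executing:

```shell
# Compose rewrites `$$MSSQL_SA_PASSWORD` to `$MSSQL_SA_PASSWORD`, and the
# container's shell expands it from the `environment:` section at runtime.
export MSSQL_SA_PASSWORD='1rootP@ssword'
echo "sqlcmd -U sa -P $MSSQL_SA_PASSWORD -d master -i /startrek.sql"
```

The password never appears in the script itself; it is injected only when the command runs.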

Second Pass: .env File

That's great, but the value is still hardcoded where we assign the environment variable. Here comes the environment file. It is a text file that holds values in key-value pairs. The file is not committed to the repository and is used to store sensitive information. docker compose reads the file and injects the values. Here is the final docker-compose file:

services:

  dab:
    image: "mcr.microsoft.com/azure-databases/data-api-builder:latest"
    container_name: trekapi
    restart: on-failure
    env_file:
      - .env
    environment:
      MY_CONN_STRING: "Server=host.docker.internal;Initial Catalog=trek;User ID=sa;Password=${SA_PWD};TrustServerCertificate=True"
    volumes:
      - "./startrek.json:/App/dab-config.json"
    ports:
      - "5000:5000"
    depends_on:
      - sqlDatabase

  sqlDatabase:
    image: mcr.microsoft.com/mssql/server
    container_name: trekdb
    hostname: sqltrek
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: ${SA_PWD}
    env_file:
      - .env
    ports:
      - "1433:1433"
    volumes:
      - ./startrek.sql:/startrek.sql
    entrypoint:
      - /bin/bash
      - -c
      - |
        /opt/mssql/bin/sqlservr & sleep 30
        /opt/mssql-tools/bin/sqlcmd -U sa -P $$MSSQL_SA_PASSWORD -d master -i /startrek.sql
        sleep infinity

Note the env_file directive in the service definitions; .env is the name of the file used. The ${SA_PWD} syntax tells docker compose to look up SA_PWD in the .env file. Here is what the file looks like:

SA_PWD=This!s@very$trongP@ssw0rd
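The DAB configuration file can get the same treatment. DAB supports an @env() token in its configuration, so the connection string can reference the variable defined in the compose file instead of being written out. A sketch of what the data-source node could look like, assuming the MY_CONN_STRING variable from the compose file above:

```json
"data-source": {
  "database-type": "mssql",
  "connection-string": "@env('MY_CONN_STRING')",
  "options": {
    "set-session-context": false
  }
}
```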

Conclusion

Simple and quick. There is no reason to keep passwords in clear text in docker compose files anymore, even for a quick demo! Of course, for a production deployment there are stronger ways to manage sensitive information, but for a demo this is perfect, and it's secure.

During the Microsoft Build day 2 keynote, Julia Liuson and John Lambert talked about how threat actors are not only looking for the big fish, but also at simple demos and old pieces of code, looking for passwords, keys, and sensitive information.

Reading Notes #595

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

 
If you think you may have interesting content, share it!

Programming

AI

Databases

Miscellaneous

~Frank

Reading Notes #594

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

AI

Databases

~Frank

Database to go! The perfect database for developer

When building a new project, we don't need a big database that scales and holds lots of data, but we still need some kind of data source. Of course, it is possible to fake it and have some hardcoded values returned by an API, but that takes time to create and it's not a database. In this post, I want to share a solution to have a portable, self-healing, disposable, disconnected database that doesn't require any installation.

The solution? Put the database in a container! It doesn't matter what database you are planning to use or on which OS you are developing. Most databases will have an official image available on Docker Hub and Docker runs on all platforms. If you feel uncomfortable with containers, have no fear, this post is beginner-friendly.

This post is part of a series where I share my experience while building a Dungeon crawler game. The code can be found on GitHub.


The Plan

Have a database ready at the "press of a button". By "ready", I mean up and running, with data in it, and accessible to all developer tools.

Preparation for the Database

We need a script to create the database schema and some data. There are many ways to achieve this. A beginner-friendly way is to create an empty database and use a tool like Azure Data Studio to help create the SQL scripts. Doing it this way will validate that the script works.

The Docker command to create the database's container will change a little depending on the database you are using, but here is what a MySQL one looks like:

docker run --name some-mysql -e MYSQL_ROOT_PASSWORD='rootPassword' -p 3306:3306 -d mysql 

Where some-mysql is the name you want to assign to your container, rootPassword is the password to be set for the MySQL root user, and -d means the container will run detached. The -p option maps port 3306 of the container to port 3306 of the host, which is required to connect to the database from the host.

output of the docker run command


Now, a MySQL server is running inside the container. To connect to the server with Azure Data Studio, use the MySQL extension for Azure Data Studio. Microsoft has a quickstart, Use Azure Data Studio to connect and query MySQL, if needed. Create a new connection in Azure Data Studio, then create a database (ex: 2d6db).

Create a new connection in Azure Data Studio

You can use the MySQL command-line tool if you prefer, but Azure Data Studio offers a lot of help when you are not that familiar with SQL. You can even use the Copilot extension and ask it to write the SQL statement for you. It's pretty good!

If you want to learn more about this, check the Open at Microsoft episode: Copilot is now in Azure Data Studio and this is how it can help you! to see it in action.

It's fantastic to generate a first draft of the create statements and to make queries.

Copilot writing SQL

Let's create two SQL scripts. The first one creates the schema with all the tables. The idea here is to write the script and execute it to validate the results. Here is an example creating only one table, to keep the post simple.

-- schema.sql

CREATE TABLE IF NOT EXISTS 2d6db.rooms (
  id int NOT NULL AUTO_INCREMENT,
  roll int DEFAULT 0,
  level int DEFAULT 1,
  size varchar(10) DEFAULT NULL,
  room_type varchar(255) DEFAULT NULL,
  description varchar(255) DEFAULT NULL,
  encounter varchar(255) DEFAULT NULL,
  exits varchar(255) DEFAULT NULL,
  is_unique bool DEFAULT false,
  PRIMARY KEY (id)
);

Now that there are tables in the database, let's fill them with seed data. For this, the second SQL script will contain insert statements to populate the tables. We don't need all the data, only what will be useful during development. Think about creating data that covers all types and scenarios; it's a development database, so it should contain data that helps you code.

-- data.sql

INSERT INTO 2d6db.rooms(roll, level, room_type, size, description, exits, is_unique)
VALUES (2,1,'Empty space', 'small','There is nothing in this small space', 'Archways',false);

INSERT INTO 2d6db.rooms(roll, level, room_type, size, description, exits, is_unique)
VALUES (3,1,'Strange Text', 'small','This narrow room connects the corridors and has no furniture. On the wall though...', 'Archways',false);

INSERT INTO 2d6db.rooms(roll, level, room_type, size, description, exits, is_unique)
VALUES (4,1,'Grakada Mural', 'small','There is a large mural of Grakada here. Her old faces smiles...', 'Archways',true);

Note: You can now stop the database's container with the command docker stop some-mysql. We don't need it anymore.

Putting All the Pieces Together

This is when the magic starts to show up. Using a Docker Compose file, we will start the database container and execute the two SQL scripts to create and populate the database.

# docker-compose.yml

services:
  
  2d6server:
    image: mysql
    command: --default-authentication-plugin=mysql_native_password
    environment:
      MYSQL_DATABASE: '2d6db'
      MYSQL_ROOT_PASSWORD: rootPassword
    ports:
       - "3306:3306"
    volumes:
      - "../database/scripts/schema.sql:/docker-entrypoint-initdb.d/1.sql"
      - "../database/scripts/data.sql:/docker-entrypoint-initdb.d/2.sql"

The docker-compose.yml file is in YAML. Compose files are usually used to start multiple containers at once, but they don't have to. In this scenario, the file defines a single container named 2d6server, using the same MySQL image and configuration as the previous Docker command. The volumes directive is new: it maps the path where the SQL scripts are located to /docker-entrypoint-initdb.d inside the container. When MySQL starts, it executes the files in that specific folder in alphabetical order. This is why the scripts are renamed 1.sql and 2.sql, as the tables must be created first.
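That alphabetical ordering is easy to see with a plain shell glob, which expands in sorted order just like the entrypoint processes the init folder (a small sketch with hypothetical local file names):

```shell
# Create the files out of order on purpose; the glob still expands sorted,
# so 1.sql (schema) is always listed before 2.sql (data).
mkdir -p initdb
touch initdb/2.sql initdb/1.sql
for f in initdb/*.sql; do
  echo "would run: $f"
done
```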

To get the database up and ready, we execute docker compose up.


# start the database
docker compose -f /path_to_your_file/docker-compose.yml up -d 

# stop the database
docker compose -f /path_to_your_file/docker-compose.yml down

By default, the command looks for a docker-compose.yml file. If your file has a different name, use the -f argument to specify it. Optionally, to free the prompt, pass the -d argument to docker compose up to run in detached mode.

Docker Compose commands

When you are done coding and you don't need the database anymore, execute the docker compose down command to free up your computer. Compared to a locally installed server, a container leaves no trace; your computer is not "polluted".

When you need to update the database, edit the SQL scripts first. When they are ready, run docker compose down and then up again; recreating the container re-executes the initialization scripts and gives you a refreshed database (a simple restart keeps the existing container and skips those scripts).

To Conclude

Now, you only need to execute one simple command to get a fresh database, whenever you want. Developers no longer need a database server installed and configured locally. And you don't need to worry about deleting or modifying data, like when using a shared database. After cloning the repository, every developer has everything they need to start coding.

In an upcoming post, I will share how I used Azure Data API Builder to generate a complete API on top of the database using the same docker compose method.

Video version!

If you prefer watching instead of reading, here is the video version of this post!

Reading Notes #558

It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you have interesting content, share it!

Databases

Programming

LowCode

Podcasts

  • Remote Versus Local Development with Mike Brevoort (Screaming in the Cloud) - Remote or not remote, that's the question... right? Interesting discussion around a polarized question.
  • DevPod for Dev Containers (DevOps and Docker Talk) - I had heard of DevPod before, but it wasn't stable enough for my occasional usage. Now it looks just perfect! Looking forward to trying it!
  • Blazor Web Assembly by Example with Toi B. Wright (Hanselminutes with Scott Hanselman) - You can hear the passion in her voice! I'm sure the book is excellent, plus I love the idea of picking only the parts that you want/need without missing anything.
~Frank


Reading Notes #555


It is time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, and books that catch my interest during the week. 

If you think you may have interesting content, share it!

 

The suggestion of the week

Programming

Databases

Podcasts

Miscellaneous

~frank

Reading Notes #548

The suggestion of the week

  • GitHub Copilot X: The AI-powered developer experience (Thomas Dohmke) - Wow! Very excited about this news. Copilot is way more than just buzz. After using it for a while, I really appreciate the suggestions it makes. Looking forward to trying the next release.

Cloud

Programming

Miscellaneous

~frank

Reading Notes #540

Good Monday!

Already time to share new reading notes. It is a habit I started a long time ago where I share a list of all the articles, blog posts, podcast episodes, and books that catch my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

Databases

Miscellaneous

Reading Notes #537

Screen capture during Windows 11 installation

Good Monday, it's time to share new reading notes. Here is a list of all the articles, podcasts, and blog posts that caught my interest during the week.

If you think you may have interesting content, share it!


The suggestion of the week

Cloud

  • Azure Storage Caching and Compression – The Missing Piece (Jeffrey T. Fritz) - This post shares all the gotchas learned along the way in developing and trying to optimize an application. The result is a faster application, a better experience for the client, and more money that stays in your pocket.

Programming

Databases

Miscellaneous



~Frank


Reading Notes #533


Good Monday!
It's time to share my reading notes: a curated list of all the articles and blog posts that caught my interest during the week. It's a mix of current events and what I consumed.

If you think you may have interesting content, share it!

Cloud

Programming

Miscellaneous

~Enjoy!


Reading Notes #529


Good Monday (standard time for many :) ),
It's time to share new reading notes. Here is a list of all the articles, podcasts, and blog posts that caught my interest during the week.

If you think you may have interesting content, share it!

Cloud

Programming

Miscellaneous


~frank

Reading Notes #451

Cloud


Programming


Miscellaneous


Podcast

Books


A Game Plan for Life: The Power of Mentoring

Author: John Wooden, John C. Maxwell

What a great book. I didn't know John Wooden before, but I am very impressed and inspired by both Coach Wooden and the author. There is so much in this book that it's worth reading again... But first, let's get prepared :)

Reading Notes #434

Every Monday, I share my "reading notes". Those are a curated list of all the articles, blog posts, podcast episodes, and books that caught my interest during the week. It's a mix of current events and what I consumed.

You think you may have interesting content, share it!

Cloud


Programming


Podcasts


Miscellaneous


~☁️


Reading Notes #427

Every Monday, I share my "reading notes". Those are a curated list of all the articles, blog posts, podcast episodes, and books that caught my interest during the week.

It's a mix of current events and what I consumed. As you will quickly see, this week is heavily tinted by the Microsoft Build event. I still have tons of videos to watch since I was busy monitoring the chat during most of the sessions.

If you think you may have interesting content, share it!

Cloud


Programming


Podcast

  • Chocolatey with Rob Reynolds (.NET Rocks!) - Awesome episode about all the good and the great of Chocolatey, that awesome package manager.

Miscellaneous


~enjoy!

Reading Notes #384

Programming

  • Install WSL 2 on Windows 10 (Thomas Maurer) - Awesome tutorial. If, like me, you didn't want to wait until the next Windows release or take the time to compile and debug a deployment... this tutorial is for us!

Databases


Miscellaneous

~

Reading Notes #363


Cloud




Programming




Databases




Miscellaneous


~


Reading Notes #356

Suggestion of the week

  • Security Headers (Tanya Janca) - Interesting post that shows the code/configuration we need to add, in order to get a more secure website.

Cloud


Programming


Miscellaneous


Books

Fast Focus: A Quick-Start Guide To Mastering Your Attention, Ignoring Distractions, And Getting More Done In Less Time! (Damon Zahariades) - Great book, well organized. Simple and straight to the point. It is divided into three parts: understanding focus, creating an environment for focus, and employing tactics to focus. It lists the top 10 obstacles to staying focused and gives you a great idea of how to start your journey.

~


Reading Notes #351


Cloud


Programming


Data


~