Docker Compose
So far we have worked with a single container. Running your local application in isolation with a couple of commands was "feasible".
The reality is that the vast majority of software systems on planet Earth (and beyond) do not consist of just one part. They consist of a frontend (React/Next), a backend (Node/Python), and one or more databases (Postgres/Redis).
Would you have to open three terminals and type:

docker run -d -p 3000:3000 --name web front
docker run -d -p 8080:8080 --name api back
docker run -d -p 5432:5432 -e POSTGRES_USER=admin --name db postgres

...every time you start your day?
And if the three need to communicate with each other, there is another problem: each container gets its own IP on Docker's network, and those IPs can change between runs. This is where Docker Compose shines.
Docker Compose is an official tool that ships with Docker to orchestrate multiple containers locally as one happy, united family, communicating with each other by service name instead of tangled IPs.
The docker-compose.yml File
It is based on a file in YAML format (a human-friendly cousin of JSON) that usually sits at the root of the project and defines the services (containers) that make up the whole stack.
A very simple example to start your API tied to a database in a couple of seconds:
# 1. The syntax version Docker will read (3 is the classic standard;
#    recent versions of Compose treat this field as optional and ignore it)
version: '3.8'

# 2. We declare all our containers
services:

  # A) Container 1: our database.
  # (No Dockerfile needed; Compose simply downloads the image from the registry)
  database:
    image: postgres:14
    ports:
      - "5432:5432"
    environment: # The configuration variables this database needs to spin up
      POSTGRES_USER: admin
      POSTGRES_PASSWORD: super_password

  # B) Container 2: our backend (the API we develop ourselves)
  backend:
    build: . # Look for the 'Dockerfile' in our own folder and build the image, since no ready-made one exists for our code
    ports:
      - "8080:8080"
    depends_on:
      - database # Start order only: Compose starts 'database' first, but it does NOT wait for Postgres to be ready to accept connections
    environment:
      # When the backend needs to reach the database, use the service name
      # where you would otherwise write "localhost": Compose's internal DNS
      # resolves 'database' to that container
      DB_HOST: database
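To see what DB_HOST buys you, here is a minimal sketch (in Python, purely illustrative) of how a backend might read that variable and build a connection string. The user, password, and URL format mirror the YAML above; the localhost fallback is an assumption for running outside Docker:

```python
import os

# Read the host Compose injected; fall back to localhost for runs
# outside of Docker. Credentials mirror the YAML example above.
db_host = os.environ.get("DB_HOST", "localhost")
db_url = f"postgresql://admin:super_password@{db_host}:5432/postgres"
print(db_url)
```

Inside the Compose network the same code resolves "database" to the Postgres container; on your bare machine it falls back to localhost, so the code works in both places unchanged.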
Bringing Up "The Orchestra"
Instead of manually typing multiple docker run commands in different windows, open a single terminal in the folder where this YAML file lives and execute:
# Brings up the ENTIRE environment at once.
# The '-d' flag (Detached) runs everything in the background.
docker-compose up -d

(On recent Docker installs, Compose also ships as a plugin, so "docker compose up -d", with a space, does the same thing.)
That's it. In a few seconds Compose pulls the official Postgres image, builds your backend image from your own Dockerfile, creates a private internal network so the backend and PostgreSQL can talk to each other by name, and starts everything in order, exposing the mapped ports on localhost for your browser. Voilà! 🚀
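One caveat from the depends_on comment above: Compose orders startup but does not wait for Postgres to actually accept connections, so backends commonly retry until the port is really open. A minimal sketch using plain TCP sockets (the helper name is hypothetical, not a Compose feature):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Retry a TCP connection until host:port accepts it or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # the service is accepting connections
        except OSError:
            time.sleep(0.5)  # not ready yet; back off briefly and retry
    return False

# Inside the backend container you would call something like:
# wait_for_port("database", 5432)
```

Alternatively, Compose supports healthchecks with depends_on conditions for the same purpose, but a retry loop in your own code is the most portable habit.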
And at the end of the day, shut it all down with the same minimalism:
docker-compose down
This stops and removes the containers and the internal network that Compose created.
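Since docker-compose down removes the containers, any data written inside them disappears with them. A hedged sketch of how you might add a named volume to the database service so Postgres data survives restarts (the volume name db_data is illustrative):

```yaml
services:
  database:
    image: postgres:14
    volumes:
      - db_data:/var/lib/postgresql/data  # Postgres's data directory

volumes:
  db_data:  # named volume; persists until you run 'docker-compose down -v'
```

With this in place, plain docker-compose down keeps your data; adding -v wipes the volume too.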