
Flexible Development Environment Using Docker
Published on April 22, 2024

For us software developers, especially in an enterprise setup, it's crucial to create a dedicated development environment that lets us manage multiple projects with different dependencies as easily and efficiently as possible.
In this post, I want to share how I set up my development environment from scratch, on every Operating System I use.
Operating System
Everything starts with which Operating System we're using. In my opinion, Ubuntu is the best operating system for software development. But since I also do gaming, my personal machine runs Windows, and I just use Windows Subsystem for Linux (WSL) for my development projects.
If you are using WSL, I prefer the Debian distro over the default Ubuntu distro, because the Debian distro doesn't have any pre-installed packages, so you're basically starting from scratch, which is what we want.
Basic Ubuntu Packages
Now that we have an OS to work with, let’s just install some basic packages we need in development.
Open up your terminal and update our package manager.
# Make sure to get the latest APT packages available
sudo apt update;

# Upgrade all installed packages in your system
sudo apt upgrade -y;
Then install our packages.
sudo apt install -y git zsh curl wget unzip make
What did we install?
- `git`: For version control and managing repositories.
- `zsh`: I prefer `zsh` over Ubuntu's default `bash` shell.
- `curl` and `wget`: For fetching remote files/endpoints. Most libraries out there use them for installation.
- `unzip`: For extracting `zip` files.
- `make`: Some libraries use `make` for installation. We can also use it to simplify our workflow by creating a `Makefile` for some projects.
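As a quick illustration of that last point, here's a tiny, hypothetical Makefile (written to `/tmp` just for the demo) that wraps two common Docker commands behind short targets:

```shell
# Write a minimal example Makefile; printf emits the literal tabs make requires
printf 'up:\n\tdocker compose up -d\n\ndown:\n\tdocker compose down\n' > /tmp/Makefile.example

# -n does a dry run: make prints the recipe it would execute without running it
make -f /tmp/Makefile.example -n up
```

With such a file at a project's root, `make up` is much easier to remember (and type) than the full command it wraps.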
Terminal
Default Login Shell
Since we've installed `zsh`, let's use it for our login shell.
chsh -s /usr/bin/zsh
This command will change the default shell from `/usr/bin/bash` to `/usr/bin/zsh`. It will ask for your password, so just type it there.
After changing the login shell, restart your terminal completely to apply the changes.
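To verify the change, you can read your user's login shell back from the passwd database; the seventh colon-separated field is the shell:

```shell
# Prints the login shell recorded for the current user; after chsh and a
# fresh login, this should show /usr/bin/zsh
getent passwd "$(id -un)" | cut -d: -f7
```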
Warp
I've recently discovered this terminal emulator on the internet, and it looks very good. Warp is a terminal application with a lot of features that make our lives way easier. It has smart autocompletion, IDE-like keybindings, a very intuitive and customizable UI, heck, it even has AI integration if you want to ask something about commands or anything else. Its latest major release includes Linux support, though Windows support is still in development (there is a workaround where you can run Warp in WSL, though it's not officially stable: Link Here).
Installation
- Go to the Warp website.
- Download the `.deb` file.
- Copy the filename of the downloaded file (including `.deb`).
- Open up your terminal and install the app using `apt`:
sudo apt install -y ~/Downloads/<filename>.deb
While Warp is a great terminal app, its main caveat for most users is that you need to sign up for an account in order to use it, which is quite weird for just a terminal application. But if you don't mind that, like me, then we're good to go!
After successful installation and sign-up, you can now configure Warp depending on what works best for you. Here are some settings I have that I want to share:
- Features > Editor
  - Open completions menu as you type = `true`
  - Expand aliases as you type = `true`
  - Show input hint text = `true`
  - Tab key behaviour = `Accept Autosuggestion`
    - I like to use `tab` to accept autosuggestions, and Warp then assigns `ctrl + space` to open the completions menu, which is like using Copilot in VSCode. I like that.
Tools
Next, we can install the tools we need for development projects: tools like `Node.js`, `Bun`, `PHP`, `Rust`, or any other command line tools you may need for specific projects. You can just search the internet for how to install such tools on your OS. I'm gonna leave that task to you!
Even if we're gonna use Docker for our dev environment, I think it's fine to have these tools installed on our machine, for generating a project, for example.
Code Editor / IDE
I don't wanna start a debate on which Code Editor or IDE is the best; let's just use the most accessible and easiest to use, Microsoft VSCode. Since we're on Ubuntu, we have a lot of ways to install VSCode on our machine, whether using `snap`, `flatpak`, etc. But for this one, I think we should just directly download the `.deb` package from the VSCode website itself.
- Open your browser and go to the VSCode website.
- Download the `.deb` file.
- Copy the filename of the downloaded file (including `.deb`).
- Open up your terminal and install the app using `apt`:
sudo apt install -y ~/Downloads/<filename>.deb
Docker
I know, I know. Docker can be very overwhelming at first. But just bear with me, because I'm gonna walk you through all the commands you need to know and how to use Docker in your projects. It's also a great skill to have as a software developer.
Why Docker?
There are a lot of blog posts and YouTube videos on why and how Docker solves a lot of common problems in software development. But in my opinion, the main reason I like to use Docker is that I can easily run all my projects at the same time without interfering with my host machine, especially when my projects have different dependencies. For example, I have these projects:
- a backend written in Node.js v21 with MySQL 5.7,
- an API written in PHP 8.1,
- a React app running in Node.js v18,
- a Laravel app running in PHP 8.3 with MySQL 8.0, with a frontend running in Node.js v20.
You can easily notice the differences in PHP and Node.js versions across these projects. If I just installed every dependency on my host machine, man, it would be a pain-in-the-ass managing different versions, and most of the time something would conflict, like ports. Docker makes it easy to isolate projects with their corresponding dependencies, ready to spin up every time.
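As a sketch of what that isolation looks like (service names here are hypothetical), each project's own compose file simply pins the image versions that project needs, so versions never collide across projects:

```yaml
# Project A's compose.yaml: Node.js 21 + MySQL 5.7
services:
  backend:
    image: node:21
  db:
    image: mysql:5.7

# Project B's compose.yaml would pin its own versions instead,
# e.g. `image: php:8.1-fpm` and `image: mysql:8.0`.
```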
Let’s start with installing Docker Engine.
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
As much as I love Docker Desktop, the Docker Engine bundled with it sometimes behaves weirdly, especially on Ubuntu. So if you are on Ubuntu, I recommend just installing the Docker Engine and using the CLI to interact with it.
After the commands succeed, you can sanity-check if Docker is really installed.
# Check docker version
docker version

# Check docker compose version
docker compose version
If both commands run without errors, congratulations! You just installed Docker! But there's one more crucial step we need to do so we can use Docker even without root privileges (without prefixing docker commands with `sudo`).
sudo usermod -aG docker $USER
This command will add your user to the `docker` group, and we need to restart our system for the changes to apply.
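Once you're back in, you can confirm the membership took effect by listing your groups; `docker` should be among them:

```shell
# Lists all groups the current user belongs to; expect to see `docker` here
# after the group change has been applied
id -nG
```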
After restarting our system, we should be able to use Docker without root privileges. Let’s test it out!
docker run hello-world
This command will pull the `hello-world` image from Docker Hub and try to run it. It should output something like:
Hello from Docker!
This message shows that your installation appears to be working correctly.

To generate this message, Docker took the following steps:
 1. The Docker client contacted the Docker daemon.
 2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
    (amd64)
 3. The Docker daemon created a new container from that image which runs the
    executable that produces the output you are currently reading.
 4. The Docker daemon streamed that output to the Docker client, which sent it
    to your terminal.

To try something more ambitious, you can run an Ubuntu container with:
 $ docker run -it ubuntu bash

Share images, automate workflows, and more with a free Docker ID:
 https://hub.docker.com/

For more examples and ideas, visit:
 https://docs.docker.com/get-started/
Nice! We now have Docker working in our system. Onto the next step!
Global Docker Services
Now, installing Docker won't instantly spin up the services we need for our development environment; it's just a tool that lets us easily run multiple services using `compose.yaml` files. So we need a global compose file for the services we'll use on every project we have.
What global services do we need?
Well, in my dev environment setup, I like to have these services globally:
Traefik
An easy-to-use reverse proxy for our services. I like to have one especially when I have to manage different projects and give each one a unique domain I can use to access the app. I don't even need to think of a unique port to expose for each project (it's tiring bruh).
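To make this concrete, here's a sketch of how any project's compose file can hook into such a proxy with just a few labels; the service name `myapp` and its domain are hypothetical:

```yaml
# Sketch: a hypothetical project service that Traefik routes to myapp.localhost
services:
  myapp:
    image: node:20
    labels:
      - traefik.docker.network=mynetwork
      - traefik.http.services.myapp.loadbalancer.server.port=3000
      - traefik.http.routers.myapp.rule=Host(`myapp.localhost`)
    networks:
      - mynetwork

networks:
  mynetwork:
    external: true
```

No published ports, no port-number bookkeeping: the proxy forwards `myapp.localhost` straight to the container.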
DbGate
This is actually optional. You can use whatever database client you're most used to. But DbGate is just the best I've tried so far. It's free, fully featured, supports all kinds of databases (SQL, NoSQL), and, best of all, we don't even need to install it as a standalone application on our system! It has an official Docker image that we can use to run the service locally on the web.
Let’s install them all!
Before we create our global services configuration, let's first create a user-defined Docker network that we will use for all our services, so that all of them connect over a single network:
# replace `mynetwork` with the network name you want
docker network create mynetwork
Then, create a `compose.yaml` file in our home directory and open it up in VSCode:
code compose.yaml
Create the `traefik` service:
services:
  traefik:
    image: traefik:v2.11
    container_name: traefik
    # Enables the web UI and tells Traefik to listen to docker
    command:
      - '--api.insecure=true'
      - '--providers.docker=true'
    ports:
      # We will expose port 80 to our host machine to intercept requests
      # from that port and forward it to traefik
      - '80:80'
      # The Traefik Web UI (enabled by --api.insecure=true)
      - '8080:8080'
    volumes:
      # So that Traefik can listen to the Docker events
      - /var/run/docker.sock:/var/run/docker.sock
    labels:
      # Optional: To assign a custom domain `traefik.localhost`
      # to the Traefik Web UI
      - traefik.docker.network=mynetwork
      - traefik.http.services.traefik.loadbalancer.server.port=8080
      - traefik.http.routers.traefik.rule=Host(`traefik.localhost`)
    networks:
      - mynetwork
Wow. This is kinda overwhelming to take in, but basically it just pulls the official Traefik v2.11 image from Docker Hub and configures it the way we want. After this, you won't even need to touch it unless you want to customize it further. You can refer to their documentation to learn more.
Now for our `dbgate` service:
  dbgate:
    image: dbgate/dbgate:latest
    container_name: dbgate
    depends_on:
      # This will wait until the traefik service is up and running
      # before starting dbgate
      - traefik
    ports:
      # This will expose port 3000 of the container, which DbGate uses,
      # but Docker will randomly choose an available port on our machine
      # to assign (so we won't have to)
      - '3000'
    volumes:
      # This will persist your DbGate configuration on container restart
      - dbgate_data:/root/.dbgate
    labels:
      # Assign a custom domain to the service
      - traefik.docker.network=mynetwork
      - traefik.http.services.dbgate.loadbalancer.server.port=3000
      - traefik.http.routers.dbgate.rule=Host(`dbgate.localhost`)
    networks:
      - mynetwork
As you may notice, we declare a volume in the `dbgate` service, so we need to include a top-level `volumes` configuration for it:
volumes:
  dbgate_data:
    driver: local
We also need to define the Docker network we used on our services:
networks:
  mynetwork:
    external: true
And... that's it! Here's our full `compose.yaml`:
services:
  traefik:
    image: traefik:v2.11
    container_name: traefik
    # Enables the web UI and tells Traefik to listen to docker
    command:
      - '--api.insecure=true'
      - '--providers.docker=true'
    ports:
      # We will expose port 80 to our host machine to intercept requests
      # from that port and forward it to traefik
      - '80:80'
      # The Traefik Web UI (enabled by --api.insecure=true)
      - '8080:8080'
    volumes:
      # So that Traefik can listen to the Docker events
      - /var/run/docker.sock:/var/run/docker.sock
    labels:
      # Optional: To assign a custom domain `traefik.localhost`
      # to the Traefik Web UI
      - traefik.docker.network=mynetwork
      - traefik.http.services.traefik.loadbalancer.server.port=8080
      - traefik.http.routers.traefik.rule=Host(`traefik.localhost`)
    networks:
      - mynetwork

  dbgate:
    image: dbgate/dbgate:latest
    container_name: dbgate
    depends_on:
      # This will wait until the traefik service is up and running
      # before starting dbgate
      - traefik
    ports:
      # This will expose port 3000 of the container, which DbGate uses,
      # but Docker will randomly choose an available port on our machine
      # to assign (so we won't have to)
      - '3000'
    volumes:
      # This will persist your DbGate configuration on container restart
      - dbgate_data:/root/.dbgate
    labels:
      # Assign a custom domain to the service
      - traefik.docker.network=mynetwork
      - traefik.http.services.dbgate.loadbalancer.server.port=3000
      - traefik.http.routers.dbgate.rule=Host(`dbgate.localhost`)
    networks:
      - mynetwork

volumes:
  dbgate_data:
    driver: local

networks:
  mynetwork:
    external: true
Time to run the services
Alrighty! Let’s spin ‘em all up!
docker compose up -d
What does this command do?
- `docker`: communicate with the Docker Engine
- `compose`: use the Compose plugin by Docker
- `up`: run the services declared in the `compose.yaml` file
- `-d`: run in detached mode (in the background)
It may take a while to pull the images and create the containers, probably around 5-10 minutes depending on your computer and internet speed. Once that's done, you should see a success message like this:
✔ Container dbgate   Started
✔ Container traefik  Started
You should be able to access the services using their custom domains!
- `traefik`: http://traefik.localhost
- `dbgate`: http://dbgate.localhost
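From here, a few everyday commands are worth knowing for managing this global stack; run them from the directory containing our `compose.yaml` (the service name below is from our file):

```shell
# Show the state of the services from the compose.yaml in the current directory
docker compose ps

# Print recent logs for one service (add -f to follow them live)
docker compose logs --tail 20 traefik

# Stop and remove the containers; named volumes like dbgate_data
# are kept unless you also pass -v
docker compose down
```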
Conclusion
That's basically it. You now have a flexible dev environment setup using Docker! To test things out, check out my other posts on setting up different frameworks with this setup. Happy coding!