Setting up a self-hosting workflow
So, you've decided to self-host your own services. You've got a shiny new VPS and you're ready to get your code deployed and up and running, ideally with some form of decent developer experience. This guide will walk you through some of the options for setting up a self-hosting workflow with Docker.
Coolify
This is possibly the best thing since sliced bread if you are looking for a fully managed solution, and probably more than enough for simple side projects. Coolify provides a simple web interface for managing your project. After giving it access to your GitHub repository, it will automatically build and deploy your code whenever you push a new commit, allowing you to have your project up and running on your VPS in minutes.
However, Coolify is not the best option for everyone. At the time of writing, the project is still in beta and it's not quite production-ready in my opinion. If you are looking for more control over your environment, or if, like me, you want to learn more about Docker and container orchestration and how things work under the hood, you may want to consider setting up your own workflow.
Fortunately, it's easier than you might think.
Docker Stack
Docker Stack is a tool that allows you to deploy your Docker Compose files to a node that has Docker swarm mode enabled. It has quickly become my favourite way to deploy to a VPS.
For the purpose of this guide, I'm going to assume that you already have a VPS instance with the Docker engine installed. If you don't, you can follow the instructions on the Docker website to get set up.
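If Docker isn't installed yet, one common shortcut is Docker's convenience script; this is just a sketch of that route (review the script before running it, and prefer your distribution's packages if piped installers are against your policy):
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh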
Accessing Docker remotely
The first step is to enable remote access to your VPS's Docker engine so you can control it without having to ssh into your VPS every time. My preferred way to achieve this is through Docker Contexts. You can create a context for your VPS by running the following command:
docker context create my_context --docker "host=ssh://user@your-vps-domain-or-ip"
After creating the context, you can switch to it by running:
docker context use my_context
With this active, any Docker command you run on your local machine will be executed on your VPS.
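A quick sanity check is to run a read-only command and confirm the output describes your VPS rather than your local machine; both of these are standard Docker CLI commands:
docker info --format '{{.Name}} ({{.OperatingSystem}})'
docker ps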
Setting up a Docker Stack
With the context defined, you're now ready to set up your node to use Docker Stack. To get started, run the following command:
docker swarm init
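If your VPS has more than one network interface, swarm init may ask which address it should advertise to other nodes; you can pass it explicitly (the IP below is a placeholder):
docker swarm init --advertise-addr 203.0.113.10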
With swarm mode enabled, you can deploy your application using a docker compose file.
docker stack deploy -c ./docker-compose.yaml myapp
You should see some output letting you know the stack is being deployed. Once it's done, you can check the status of your stack by running:
docker stack ls
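If you haven't written a compose file with swarm in mind before, here's a minimal sketch of what docker-compose.yaml might look like for a single web service. The service name, image, and ports are placeholders; the deploy section is the swarm-specific part that controls replicas and rolling updates:

services:
  web:
    image: ghcr.io/your-user/your-app:latest   # placeholder image
    ports:
      - "80:8080"
    deploy:
      replicas: 2
      update_config:
        order: start-first   # start the new task before stopping the old one
      restart_policy:
        condition: on-failure

Once deployed, docker stack services myapp and docker stack ps myapp will show the state of each service and its tasks.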
Managing secrets
Docker secret lets you create secrets inside your Docker host in a way that's encrypted both at rest and in transit. This is a great way to store sensitive information like API keys, database passwords, and other credentials.
Because docker secret is designed to keep values off the command line, you can't just pass a secret as an argument to the command. Instead, you'll need to load the secret from a file or pipe it in.
docker secret create SECRETNAME ./mysecret.txt
Here's an example for macOS/Linux of piping a secret into the docker secret command:
printf 'mysecret' | docker secret create SECRETNAME -
NB: If you do this, don't forget to remove the secret from your shell history!
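Once a secret exists on the swarm, you reference it from your stack file and Docker mounts it into each container as a file under /run/secrets. A minimal sketch, reusing the placeholder SECRETNAME from above:

services:
  web:
    image: ghcr.io/your-user/your-app:latest
    secrets:
      - SECRETNAME   # readable inside the container at /run/secrets/SECRETNAME

secrets:
  SECRETNAME:
    external: true   # created ahead of time with docker secret create

Your application then reads the value from that file at startup rather than from an environment variable.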
Automating deployments with GitHub Actions
GitHub Actions is a great way to automate your deployments. You can set up a workflow that will build and deploy your code whenever you push a new commit to your repository. Here's a simple example for a Go application:
name: pipeline

on:
  push:
    branches:
      - "main"

permissions:
  packages: write
  contents: read

jobs:
  run-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup Go
        uses: actions/setup-go@v5
        with:
          go-version: "1.23.x"
      - name: Install dependencies
        run: go get .
      - name: Test with the Go CLI
        run: go test ./...

  build-and-push-image:
    runs-on: ubuntu-latest
    needs:
      - run-tests
    steps:
      - name: Set up Docker
        uses: docker/setup-docker-action@v4
        with:
          daemon-config: |
            {
              "debug": true,
              "features": {
                "containerd-snapshotter": true
              }
            }
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          registry: https://ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3
      - name: Build and push Docker image
        uses: docker/build-push-action@v6
        with:
          platforms: linux/amd64,linux/arm64
          load: true
          context: .
          push: true
          tags: |
            ghcr.io/kpresta04/zenstats:latest
            ghcr.io/kpresta04/zenstats:${{ github.sha }}

  deploy:
    runs-on: ubuntu-latest
    needs:
      - build-and-push-image
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: create env file
        run: |
          echo "GIT_COMMIT_HASH=${{ github.sha }}" >> ./envfile
      - name: Docker Stack Deploy
        uses: cssnr/stack-deploy-action@v1
        with:
          name: zenfulstats
          file: docker-stack.yaml
          host: greenlight.kellenpresta.dev
          user: deploy
          ssh_key: ${{ secrets.DEPLOY_SSH_PRIVATE_KEY }}
          env_file: ./envfile
In this example, there are three jobs which run whenever code is merged to the main branch. The run-tests job runs the tests, the build-and-push-image job builds and pushes your Docker image to the GitHub Container Registry, and the deploy job deploys the new image to your VPS using Docker Stack.
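The deploy job expects a docker-stack.yaml at the root of the repository. Here's a rough sketch of what that file could look like, assuming the variables from envfile are made available for substitution when the stack is deployed; the service name and ports are placeholders, while the image matches what the build job pushes:

services:
  web:
    image: ghcr.io/kpresta04/zenstats:${GIT_COMMIT_HASH}
    ports:
      - "80:8080"
    deploy:
      replicas: 2
      update_config:
        order: start-first

Pinning the tag to the commit hash means each push rolls the stack forward to exactly the image that was just built.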
Note the user and ssh_key fields in the deploy job. Both of these will need to be set up in order for this to work.
Creating a deploy user for your VPS
To create a deploy user on your VPS, you can run the following command as root:
adduser deploy
Then add them to the Docker group:
usermod -aG docker deploy
From your local machine, you can then generate an SSH key for the deploy user:
ssh-keygen -t ed25519 -C "deploy@mydomain"
When this is done, it should have created two files, one for your private key and one for your public key (by default ~/.ssh/id_ed25519 and ~/.ssh/id_ed25519.pub). You can then copy the public key to your VPS:
su - deploy
mkdir .ssh
echo 'SSH KEY' > .ssh/authorized_keys
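Depending on your distribution's default umask, you may also need to tighten permissions so sshd accepts the key:
chmod 700 .ssh
chmod 600 .ssh/authorized_keys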
It's a good idea to restrict what commands this user can execute in case the keys are ever compromised. Open the authorized_keys file you just created:
vim .ssh/authorized_keys
Add the following to the beginning of the line containing the key:
command="docker system dial-stdio"
Finally, add the private key to your GitHub repository as a secret. You can do this by going to your repository, clicking on Settings, then Secrets and variables, then Actions, and adding a new repository secret with a name that matches the value in your GitHub Action, such as DEPLOY_SSH_PRIVATE_KEY. Paste in the contents of the private key file you generated earlier.
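If you prefer the terminal and have the GitHub CLI installed, the same secret can be added by piping the key file in (assuming the default key path from earlier):
gh secret set DEPLOY_SSH_PRIVATE_KEY < ~/.ssh/id_ed25519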
Conclusion
Docker Stack & GitHub Actions are a powerful combination for automating your deployments. These tools are all you need to set up a self-hosting workflow that includes:
- Blue/green deployments
- Rolling releases
- Secure secrets management
- Rollback support
- Clustering
With these tools in your toolbox, you should be well on your way to setting up a workflow that will make your life easier and your deployments more reliable, without the expense of a managed solution. Happy coding!