Hello nerds! I’m hosting a lot of things on my home lab using docker compose, with the config files in a private GitHub repo. This works fine, but every time I want to make a change I have to push it, then ssh to the lab, pull the changes, and run docker compose up. I’d like to automate that.
Does anyone have a similar setup and know of a good tool? I know I could use watchtower to update existing images, but this is more for when I change a setting or add a new service.
I’ve considered roughly four approaches:

1. A new container that mounts the whole compose directory and the Docker socket. It registers a webhook in GitHub to be notified when I push to the repo, then runs git pull and docker compose up. My worry here is the usual docker-in-docker (dind) gotchas.
2. Same as 1, but mount nothing; instead, ssh from the container to the host and run the steps there. This avoids any dind issues, but I don’t love giving the container an ssh key to the host.
3. A service running on the host, outside of Docker. This is probably the correct approach, but it’s annoying since my host is a Synology NAS and it doesn’t have systemd or anything like that, afaik.
4. A GitHub Action that sshes to the machine and runs the steps. Honestly the easiest way, but I’d prefer not to open ssh to the internet.
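For what approach 1 could look like, here’s a minimal sketch of the webhook receiver. Everything specific in it is an assumption: the repo mount path, the port, and the shared secret are made up, and it presumes the Docker socket is mounted into the container so the docker CLI talks to the host daemon.

```python
# Sketch of a GitHub push-webhook receiver for approach 1.
# Assumptions: the compose repo is mounted at REPO_DIR, the Docker socket
# is mounted into this container, and WEBHOOK_SECRET matches the secret
# configured on the GitHub webhook.
import hashlib
import hmac
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

REPO_DIR = "/srv/compose"      # assumption: where the repo is mounted
WEBHOOK_SECRET = b"change-me"  # assumption: secret shared with GitHub

def verify_signature(secret: bytes, body: bytes, signature_header: str) -> bool:
    """Check GitHub's X-Hub-Signature-256 header against the request body."""
    expected = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_header or "")

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        if not verify_signature(WEBHOOK_SECRET, body,
                                self.headers.get("X-Hub-Signature-256", "")):
            self.send_response(403)
            self.end_headers()
            return
        # Pull the new config and apply it; compose starts added services
        # and recreates changed ones.
        subprocess.run(["git", "-C", REPO_DIR, "pull", "--ff-only"], check=True)
        subprocess.run(["docker", "compose", "up", "-d", "--remove-orphans"],
                       cwd=REPO_DIR, check=True)
        self.send_response(204)
        self.end_headers()

# To run inside the container:
#   HTTPServer(("", 8000), Handler).serve_forever()
```

Verifying the signature matters here because the webhook port has to be reachable from GitHub, so anything on the internet can POST to it.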
Any feedback or tips are much appreciated. None of my options feel very good, and I suspect I’m missing something obvious.
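One variant of approach 3 that sidesteps both systemd and exposing anything to the internet is to poll instead of pushing: run a small script on a schedule (Synology’s DSM has a Task Scheduler that can run user scripts at an interval) that fetches, and only redeploys when the remote branch has moved. A rough sketch, with hypothetical paths and branch name:

```python
# Sketch of a pull-based deploy for approach 3: run periodically on the
# host, redeploy only if the remote branch moved. Paths are assumptions.
import subprocess

REPO_DIR = "/volume1/docker/compose"  # assumption: repo checkout on the NAS
BRANCH = "main"                       # assumption: deployed branch

def rev(repo_dir: str, ref: str) -> str:
    """Return the commit hash that a ref points to."""
    out = subprocess.run(["git", "-C", repo_dir, "rev-parse", ref],
                         check=True, capture_output=True, text=True)
    return out.stdout.strip()

def deploy_if_changed(repo_dir: str, branch: str) -> bool:
    """Fetch; if origin/<branch> is ahead of HEAD, fast-forward and redeploy."""
    subprocess.run(["git", "-C", repo_dir, "fetch", "origin", branch], check=True)
    if rev(repo_dir, "HEAD") == rev(repo_dir, f"origin/{branch}"):
        return False  # nothing new was pushed
    subprocess.run(["git", "-C", repo_dir, "merge", "--ff-only",
                    f"origin/{branch}"], check=True)
    subprocess.run(["docker", "compose", "up", "-d", "--remove-orphans"],
                   cwd=repo_dir, check=True)
    return True

# From DSM Task Scheduler / cron, e.g. every few minutes:
#   deploy_if_changed(REPO_DIR, BRANCH)
```

The trade-off versus a webhook is latency (deploys happen on the next poll rather than instantly), but no inbound port or ssh key is needed anywhere.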
I’d be a bit concerned with having the git repo also be hosted on the machine itself. If the drives break, it’s all gone. I could of course have two remotes, but then pushing changes becomes a multi-step procedure again.
Back up, mate. Either locally or something over the network. When it comes to data loss, it will come find you eventually.
I do have nightly off-site backups, that’s true. Still, having the git repo be on the same machine doesn’t seem right to me.