Hello! 😀
I want to share my thoughts on Docker and maybe discuss it a bit!
A few months ago I started my homelab, and like any good homelabber I absolutely loved using Docker: simple to deploy and all that. Sadly, my mind has been changing lately. I recently switched to LXC containers to make backups easier, and the experience is pretty great; the only downside is that not every piece of software is available natively outside of Docker 🙃
But I also switched for more control, since Docker can make it difficult to set up things the devs didn’t really plan for.
So here are my thoughts: I’m slowly leaving Docker for a more old-school way of hosting services. Don’t get me wrong, Docker is awesome for some use cases; the main ones are that it’s really portable and simple to deploy, with no hundreds of dependencies, etc. And through this I think I’ve really figured out where Docker is useful: not in every single homelab setup, and not in mine.

Maybe I’m doing something wrong, but I’ll let you discuss it in the comments, thanks.

1 point

I don’t like Docker. It’s hard to update containers, hard to modify specific settings, hard to configure network settings; overall I’ve just had a bad experience. It’s fantastic for quickly spinning things up, but for long-term use, and for customizing it to work well with all my services, I find it lacking.

I just create Debian containers or VMs for my different services using Proxmox. I have full control over all the settings I didn’t have in Docker.
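For reference, creating such a Debian LXC container on Proxmox takes one `pct` command. This is just a sketch; the VMID, template filename, storage names, and resource sizes below are made-up examples you would adapt to your own node:

```shell
# Create an unprivileged Debian container on a Proxmox node (example values)
pct create 101 local:vztmpl/debian-12-standard_12.2-1_amd64.tar.zst \
  --hostname myservice \          # hypothetical hostname
  --memory 1024 \                 # MiB of RAM
  --rootfs local-lvm:8 \          # 8 GiB root disk on the local-lvm storage
  --net0 name=eth0,bridge=vmbr0,ip=dhcp \
  --unprivileged 1
pct start 101
```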

1 point

Use Portainer + Watchtower.
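For anyone unfamiliar, Watchtower is a container that watches your other containers and pulls newer images automatically. A minimal sketch, assuming the stock `containrrr/watchtower` image:

```shell
# Run Watchtower; it periodically checks running containers for newer images
# and recreates them. --cleanup removes the superseded old images.
docker run -d --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower --cleanup
```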

1 point

The good old way is not that bad.

9 points

What do you mean it’s hard to update containers?

6 points

For real. Map persistent data out and then just docker compose pull && docker compose up -d. There’s nothing to it. Regular backups make reverting to previous container versions a breeze.
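As a sketch of that workflow (the service name and paths here are made up), with data mapped onto the host the update is two commands:

```shell
# docker-compose.yml maps persistent data onto the host, e.g.:
#   services:
#     app:
#       image: example/app:latest
#       volumes:
#         - ./data:/var/lib/app   # survives container re-creation
docker compose pull    # fetch any newer images
docker compose up -d   # recreate only containers whose image changed
```

Because the data lives in `./data` on the host, recreating (or rolling back) the container doesn’t touch it.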

0 points

For one, if the compose file syntax, structure, or options change (like they did recently for Immich), you have to dig through GitHub issues to find that out and re-create the compose file with little guidance.

Not Docker’s fault specifically, but it’s becoming an issue as more and more software is shipped as a Docker image. Docker democratizes software, but we pay the price in losing perspective on what good dev practice is.

7 points

Docker is a convoluted mess of overlays and truly weird network settings. I found that I have no interest in application containers and would much prefer to set up multiple services in a system container (or VM) as if it were a bare-metal server. I deploy a small Proxmox cluster with Proxmox Backup Server in a CT on each node, and I often use scripts from https://community-scripts.github.io/ProxmoxVE/. Everything is automatically backed up (and remote-synced twice) with a deduplication factor of 10. A Dockerless homelab FTW!

1 point

Yeah, I share your point of view, and I think I’m going this way. These scripts are awesome, but I prefer writing my own since I get more control over them.

30 points

It’s hard for me to tell if I’m just set in my ways according to the way I used to do it, but I feel exactly the same.

I think Docker started as “we’re doing things at massive scale, and we need to have a way to spin up new installations automatically and reliably.” That was good.

It’s now become “if I automate the installation of my software, it doesn’t matter that the whole thing is a teetering mess of dependencies and scripted hacks, because it’ll all be hidden inside the container, and also people with no real understanding can just push the button and deploy it.”

I forced myself to learn Docker to install a few things, found it incredibly hard to do anything of consequence to the software inside the container, and for my use case it added extra complexity for no reason, so I mostly abandoned it.

10 points

I hate how Docker made it so that a lot of projects only offer Docker as the official way to install the software.

This is my tinfoil-hat opinion, but to me, Docker seems to enable the “phone-ification” (for lack of a better term) of software. The upside is that it’s more accessible to spin up services on a home server. The downside is that we are losing the knowledge of how the different parts of the software work together.

I really like the Turnkey Linux projects. It’s like the best of both worlds: you deploy a container and a script sets it up for you, but after that you have full control over the software, just like when you install the binaries yourself.

11 points

I hate how Docker made it so that a lot of projects only offer Docker as the official way to install the software.

Just so we’re clear on this: it is not Docker’s fault. The projects chose Docker as a distribution method, most likely because it’s as widespread and well-known as it is. It’s simply a way to reach more users without spreading too thin.

1 point

Yeah, but it’s hard to separate the two, and it’s easy to get a bit resentful, particularly when a project’s quality declines in large part because the devs got lazy, duct-taping in container registries instead of managing their project more carefully.

4 points

You are right and I should have been more precise.

I understand why Docker was created and became popular: it abstracts away a lot of the setup and makes deployment a lot easier.

2 points

I agree with that; Docker can be simple but can be a real pain too. Good old scripts are the way to go in my opinion, but I kinda like LXC containers. This principle of containerization is surely great, just maybe not the way Docker does it… (maybe distrobox could be good too 🤷)

Docker is absolutely good when you have to scale your environment, but I think you should build your own images and not use prebuilt ones.
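Building your own image instead of pulling a prebuilt one is just a tag and a build away. A minimal sketch, where the image name and Dockerfile location are hypothetical:

```shell
# Build your own image from the Dockerfile in the current directory,
# then run it instead of a prebuilt image from a registry.
docker build -t myapp:1.0 .
docker run -d --name myapp myapp:1.0
```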

10 points

Honestly, after using Docker and containerization for more than a decade, my home setups are just YunoHost or bare metal (a small Pi) with some periodic backups. I care more about my own time now than about my home setup, and I want things to just be stable. It’s been good for a couple of years now, with nothing needed beyond some quick updates. You don’t have to deal with infra changes on updates, you don’t have to deal with slowdowns; everything works pretty well.

At work it’s different: Docker, Kubernetes, etc. are awesome because they deal gracefully with dependencies, multiple deploys per day, and large infra. But I’ll be the first to admit that takes a bit more manpower, and monitoring systems much better than a small home setup’s.

3 points

Yeah, I think that in the end, even if it seems a bit “retro”, a “normal install” with periodic backups/updates on a plain VM (or even an LXC container) is the best option: the most stable and configurable.

1 point

How is it more stable or configurable? I have Docker containers backing up the folder where all the data lives off-site daily. I also back up the whole container daily on-site. I have found it so easy. I admit it was a pain to learn, but after everything was moved over it has been easier.

1 point

Do you use any sort of RAID? Recently I’ve been using an old SSD, but back 9-ish years ago I used to back everything up with a RAID system; it took too much time to keep up, though.

4 points

I have RAID 1 on the Proxmox host to back up the VMs and their data.

5 points

I tend to agree with your opinion too, but lately YunoHost has quite a few broken apps, they’re not very fast with updates, and there aren’t many active developers. Hats off to them though, because they’re doing the best they can!

4 points

I have to agree; the community seems to come and go. Some apps have daily updates and some have been updated only once. If I were to start a new server, I would probably still pick YunoHost, but run some of the older apps separately as one-offs. The Lemmy one, for example, is stuck on a VERY old version, while the GotoSocial app is updated every time there’s an update in the main repo.

Still, super good support for something that is free and open source. Stable too :) but sometimes stability means old.

4 points

I haven’t really tried YunoHost. Is it basically a simple self-hostable cloud server?

2 points

I like reminding people that with every new technology, the old one is still around. The new gets most of the attention, but the old is still kicking. (We still have wire-wrapped programs kicking around.)

You are all good. Spend your limited attention on other things.


Selfhosted

!selfhosted@lemmy.world
