But this is some Docker shit. For me, Docker always feels a little corporate. It's just not very conventional, with these multiline commands just to run a command inside a container. Especially the obligatory '-it' to fucking see anything. It's not really straightforward. But if you get used to it, you can make a lot of aliases to use it more easily.
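For instance, getting an interactive shell in a container takes a fairly long incantation, and the `-i`/`-t` flags are easy to forget. A couple of aliases in `~/.bashrc` can shorten that; the alias names here are made up, but the `docker` flags are real:

```shell
# ~/.bashrc (or ~/.zshrc) -- hypothetical alias names, pick your own.
# -i keeps stdin open and -t allocates a TTY; without both,
# an interactive shell inside the container shows you nothing.
alias dsh='docker run --rm -it'   # e.g. dsh ubuntu bash
alias dex='docker exec -it'       # e.g. dex mycontainer sh

# Trim `docker ps` down to the columns you actually read.
alias dps='docker ps --format "table {{.Names}}\t{{.Status}}"'
```

With these, `dsh ubuntu bash` replaces `docker run --rm -it ubuntu bash`, which takes some of the sting out of the verbosity.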
"Docker always feels a little corporate."
I work in an 'essential service' environment for my main gig, where lots of checks and cross-checks need to exist. It's one that's been under constant low-grade attack forever, since it contains a LOT of tasty PII (personal info), and therefore has regulations hammering it into shape. Docker cannot play here - and neither can Debian, actually, nor its derivatives - because it lacks the signed validation available in peer products sharing its space. As soon as the adults show up and notice that a product with reduced validation is in place where a better one exists, the people owning that system have to write a life-cycle plan to upgrade it, and that plan is reviewed at an almost punitive frequency.
So, if you're saying it's a little too corporate, I'm thinking you mean 'suits and power lunches' and not 'large-scale management of crucial systems and essential data'. True?
Tbh, I've never worked in such an environment. I know somebody who told me similar things, and I'd love to hear more about it to form my own opinion. But it's not that deep. When I say corporate, I mean it's full of GUIDs and machine-readable-only names, commands, and configs. It's also, most of the time, not designed with flexibility in mind and covers only the use cases most commonly needed by the company supporting it. It just doesn't have the free spirit that most open-source tools, which are designed with humans in mind, have. If you need to supply a parameter just to get output from a command that is often run manually, when you could instead have one that deactivates output for script usage, that seems like the wrong way to go.
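To make that last design point concrete, here is a minimal sketch of a hypothetical CLI (the tool name and flag are made up) that defaults to human-readable output and only goes quiet when explicitly asked, instead of requiring a flag just to see anything:

```python
import argparse

def build_parser():
    # Human-friendly default: print results unless --quiet is given,
    # rather than demanding an extra flag just to see any output.
    parser = argparse.ArgumentParser(prog="mytool")  # hypothetical tool
    parser.add_argument("--quiet", action="store_true",
                        help="suppress output, for use in scripts")
    return parser

def run(argv):
    args = build_parser().parse_args(argv)
    result = "42 items processed"  # placeholder for the real work
    if not args.quiet:
        print(result)  # visible by default for manual use
    return result

run([])           # manual invocation: output is printed
run(["--quiet"])  # script invocation: silent, result still returned
```

The design choice is simply which caller pays the cost of an extra flag: here it is the script, which is already written once and reused, rather than the human typing the command by hand.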