Part of what’s making learning Linux so difficult for me is how fragmented it all is. You can install programs with sudo apt install (program). You can get programs with snaps. You can get programs with flatpaks. You can install from tar.gz files. You can install from .deb files. You can install with .sh files. There are probably more I don’t know about.
I don’t even know where all these programs are being installed. I haven’t learned how to uninstall them yet. And I’m sure that each way has a different way to uninstall too.
So that brings me to my main question. Why not consolidate all this? Sure, files CAN be installed anywhere if you want, but why not make a folder like /home/programs/ where it’s assumed that programs would be installed?
On Windows, programs can be installed anywhere, but the default is C:\Program Files (x86)\ or something like that. Now, you can change it all you want when you install a program. I could install to C:\Fuckfuckfuck\ if I wanted to. I don’t want to, so I leave it alone, because C:\Program Files (x86)\ is where it’s assumed all the files are.
Furthermore, I see no benefit to installing 15 different programs in 7 different folders. I begrudgingly understand why there are so many different installation methods, but I do NOT understand why, as a community, we can’t have something like a standardized setting in each distro that lets you set one place for all your installed programs.
Because of the fragmentation of distros, I can understand why we can’t have a standardized location across all distros like Windows has. However, I DON’T see why we can’t have a setting, chosen on the first boot after installation, that tells every future install which folder to install to.
I would personally pick /Home/Programs/, but maybe you want /root/Jamies Files/ because you’re Jamie, and those are your files.
In either case, on that first boot after the install, it would ask us where we want our program files installed. And from then on, no matter what install method you chose, it would default to whatever folder you picked.
Now, you could still install to other places too, but you would need to specify that on a per-install basis.
So what’s the benefit of having programs installed in separate locations that are wildly different?
Why not consolidate all this?
But, there is actually a location standard: https://en.wikipedia.org/wiki/Filesystem_Hierarchy_Standard
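For a rough idea of what that standard says, here’s where a typical package’s files end up (illustrative only; exact locations vary by package and distro):

    /usr/bin/          the executables
    /usr/lib/          shared libraries
    /usr/share/doc/    documentation and licenses
    /usr/share/man/    man pages
    /etc/              system-wide config
    ~/.config/         per-user config (that one is the XDG spec rather than the FHS)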
What you’re seeing is the result of decades of new ways to install stuff being added at different times. As a new way is added, all the old ways still need to work, because getting everyone to switch to the new way is impossible. There is no central authority making these decisions; it’s more of a marketplace of ideas, with different ‘sellers’ competing for attention.
I might add, every new way actually seeks to “consolidate” all the older ways, and always ends up adding to the list of ways needing to be consolidated.
Obligatory XKCD reference: https://xkcd.com/927/
Linux doesn’t really have stand-alone programs like Windows does. A package is a set of files that get placed in the proper places, plus some optional scripting. Packages have dependencies, so you can’t just run a binary out of a package on its own. The closest thing Linux has is AppImage, but it has lost a lot of steam.
In Linux there are two general types of package managers. The first is native packages. Native packages install to the root filesystem and are part of the core system.
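To make that concrete, here’s roughly what the native route looks like with apt on a Debian/Ubuntu system (tree is just an arbitrary example package; other distros use dnf, pacman, etc., but the idea is the same):

    sudo apt install tree      # install a native package
    dpkg -L tree               # list every file the package placed on the root filesystem
    apt-cache depends tree     # show the other packages it depends on
    sudo apt remove tree       # uninstall it again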
The second type of package manager is the portable format like Flatpak. Flatpaks can be installed either system-wide or per user. The big difference is that they run in their own environment and have limited permissions. This is done by creating a sandbox that has its own filesystem, so that it is independent of the system. This is also what makes them portable, as that environment is the same no matter what.
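As a rough sketch of what that looks like day to day (org.mozilla.firefox is just an example app ID):

    flatpak install --user flathub org.mozilla.firefox   # per-user install, no root needed
    flatpak install flathub org.mozilla.firefox           # or system-wide instead
    flatpak run org.mozilla.firefox                       # runs inside its sandbox
    ls ~/.var/app/org.mozilla.firefox/                    # the app's data and config live here
    flatpak uninstall org.mozilla.firefox                 # remove it again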
Technically snap packages are portable too, but you aren’t going to see much use outside of Ubuntu, since the underlying architecture has so many flaws.
This is sometimes true.
Go and Rust both (often) build single-executable binaries, often with very few (and, rarely, no) dependencies. It’s becoming more rare for developers to include proper man pages, more’s the pity, but things like man pages, READMEs, and LICENSE files are often the only assets packages from these languages include.
If you’re installing with Cargo or go install, then even the intermediate build assets are fairly well-contained; go install hides binaries quite effectively from users who don’t know to add $GOPATH/bin to their PATH, because Go puts everything into a single tree rooted there.
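For example (ripgrep and gopls are just examples, and the paths assume default settings):

    cargo install ripgrep                          # builds and drops a single rg binary into ~/.cargo/bin
    go install golang.org/x/tools/gopls@latest     # single binary lands in ~/go/bin, i.e. $GOPATH/bin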
Libraries are a different matter; you get headers and usually more documentation. Interpreted languages are as you say: a great pile of shit spewed all over your system, and how bad that can be depends a lot on how many different ways you install them.
Anyway, I’m not disagreeing with you, except that it’s a trend for newer compiled languages to build stand-alone binaries that you can usually just copy between systems and have it work.
Now does flatpak get its programs from the same place that the terminal would? I’m still trying to grasp what’s even happening here. Because from my (limited) experience I like flatpaks more than any other method I’ve used so far, and I’m unclear why anyone would use the terminal if given the choice.
As for snaps, I heard Ubuntu owns the technology behind snaps, and for some reason everybody hates snaps because Canonical owns it. Which I don’t get. As far as I know they don’t abuse snaps, and they don’t cause viruses or anything. So why would it matter who owns the technology behind them?
everybody hates snaps because canonical owns it
We kind of like things to be open so that we can review them, or replace them. The snap store is proprietary and controlled by Canonical. I don’t want my data collected and subject to Canonical’s EULA when using my choice of distro.
Canonical has a history of making bad choices, so the level of trust is not very high. It feels like an attempt at embrace, extend, extinguish: get people hooked on snaps and then make snaps suck on other distros, that kind of thing.
The reason most people don’t like snaps is fairly complicated. It started with Ubuntu forcing some basic packages to install as snaps instead of native packages. The thing is, snaps are not native packages, and because of this it caused major problems. These days a lot of the issues have been addressed, but there are still some serious design flaws. The biggest is that it is way overly complex and depends on a privileged daemon. The result is poor performance and a clunky experience.
Now does flatpak get its programs from the same place that the terminal would?
I usually install Flatpaks from the terminal, but as to your question: no, the distro’s package manager and Flatpak have different repositories (servers with software packages) and formats. While distros like Fedora have their own Flatpak repositories, most people use Flathub. You can install apps as Flatpak on any distro that supports them, but native package managers generally don’t support other distros’ repositories.
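Roughly, the remotes are just configured package sources, something like this (org.videolan.VLC is just an example ID):

    flatpak remotes                             # list the repositories flatpak already knows about
    flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
    flatpak search vlc                          # search across the configured remotes
    flatpak install flathub org.videolan.VLC    # pull an app from that remote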
for some reason everybody hates snaps because canonical owns it.
As I understand it, the Snap server software is proprietary and doesn’t support independent repositories, so you have to install snaps from Canonical. This is not exactly in line with Free (as in Freedom) Software principles. Canonical has also made many questionable decisions in the past.
This is part of why these days I just stick to flatpaks. No fragmentation, same on any distro, I know where all the programs are going as well as all their config files.
If I want to back up my flatpaks I can do so trivially.
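Roughly how I do it (a sketch, assuming user-level installs; my-apps.txt is just whatever filename you like):

    flatpak list --app --columns=application > my-apps.txt   # record which apps are installed
    tar -czf flatpak-data.tar.gz -C ~ .var/app                # all their data and config live under ~/.var/app
    # on the new machine: restore ~/.var/app, then reinstall from the list
    xargs -a my-apps.txt flatpak install -y flathub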
It’s a godsend. Way better than having a bunch of different formats everywhere, or the Windows-style mess where some programs are installed in an XYZ directory, some in Program Files, some in Program Files (x86), with config files saved literally anywhere. Maybe it’s in one of the dozens of poorly laid out AppData folders, maybe it’s where the exe is, maybe it’s in Documents, maybe it’s in C:\, maybe it’s hidden in my user directory, etc. I’ve even seen config files saved to bloody OneDrive by default, leading to some funky app behaviour when I wasn’t connected to the internet, or when I ran out of OneDrive space.
No, because Docker shows up in random places in your system and takes forever to set up compared to the actual program.
There are a few repos for online management consoles, and the original version used a .sh file that installed in 30 seconds on a single-core free VPS. The Docker version took like two minutes, and when I uninstalled it, I still had traces of Docker on the VPS.
Windows certainly doesn’t have uniformity.
Where are my game saves located?
Are they in my hidden AppData folder? If so, which of the three subdirectories do they live in?
If not there, then surely they’re in the Saved Games folder.
Nope. They must be in My Documents.
Shit… Maybe in the Program Files?
For fuck sake, where is it?!
Web browser > search > pcgamingwiki (great resource BTW), save game location. AH-HA! IT’S IN… My Documents?
I just checked there! (Half an hour passes)
Found it! Now why the FUCK does Windows partition the local user directory from the OneDrive user directory?!
Windows is a FUCKING mess. Once you get used to Linux, you’ll understand the worst thing is Mozilla thinks it’s okay to put its config file one directory up from where it should be.
@Zozano @Lost_My_Mind
Seriously, I have been gaming for more than 30 years, and if you want me to swear I’ll swear. For 25 years I have been playing pirated games. On v1.0.0 of a random game, the save file is located in My Documents\My Saved Files; then after installing update 1.1.0 the saved progress is gone, and after much research I find it under My Documents\the game name; then after 1.2.0 I find it under AppData\Local… and so on, and every time the name changes and so does the path.
@Zozano @Lost_My_Mind so I lose a lot of time researching, and since they were pirated games, support was poor, so I even searched the dark web (not the dark dark web, but what I call the dark web: the 3rd+ page of Google).