I can only speak for myself, but I have always had bad luck with Linux on desktop. Something always breaks, isn’t compatible, or requires a lengthy installation process involving compiling multiple libraries because no .deb or .rpm is available.
On servers, it’s fantastic. If you count VMs, I have far more Linux installations than Windows. In general, I use Win10 LTSC for anything that requires a GUI and Ubuntu Server for anything that only needs CLI or hosts a web interface.
Try Pop!_OS, it just works.
Ironic that Windows has become the same way. New functionality is available first as a PowerShell command before the GUI control is written. This is because those are two separate efforts: first you write the function, then you write the GUI element that calls it.
Irony #2 is that Pop!_OS ships with more settings in the GUI than any other Linux distro I have used. Maybe you haven't tried it.
To say no distro can fix this is nonsense. Any distro can build new GUI elements, and because it's open source, once the work is done other distros can add the same controls to their own menus.
It has taken Microsoft over a decade to develop the new Settings app, and they still haven't achieved feature parity with the Control Panel. That should make it obvious how much work is required.
So the solution is that we just need to write more GUI menus for Linux, and I'm fine with that. It's nice to have the option to use a menu or edit the text file directly. Then everyone gets what they want.
At least for me, the whole "made by devs, for devs" thing isn't really the major downfall. It's the fact that the system can't be trusted to remain functional in a dynamic environment. I like using the command line, but sometimes that's just not enough.
If I need a specific software package, I can download the source and compile it, along with the hundred or so libraries they chose not to include in the .tar.gz file, and eventually get it running.
However, when I run an "apt upgrade" and it changes enough of the system, the binary I compiled earlier stops working. Then I spend hours trying to recompile it along with its dependencies, only to find that it doesn't support some obscure sub-version of a package that got installed along with the latest security updates.
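For what it's worth, here's a minimal sketch of how I'd diagnose that kind of breakage before committing to hours of recompiling (the binary name ./mytool and the libfoo library are placeholders, not real packages):

    # List the shared libraries the binary links against; anything the
    # upgrade removed or replaced shows up as "not found".
    ldd ./mytool | grep "not found"

    # On Debian/Ubuntu, find which package owns the library that changed.
    dpkg -S libfoo.so.4

    # Show which versions of that package apt currently knows about.
    apt-cache policy libfoo4

That at least tells you whether you're dealing with a missing library or a version mismatch before you start rebuilding the whole dependency tree.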
In a static environment where I never change settings or install software (like my NAS), it's perfect. On my desktop PC, I just want it to work well enough that I can tinker with other things. I don't want to troubleshoot why GNOME or KDE isn't working with my video drivers when all I want to do is launch a remote desktop session so I can tinker with stuff on a server that I actually want to tinker with.
My experience with Arch and BTRFS has been nothing but great. If my system breaks, I can just roll back to a snapshot.
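For anyone curious, this is roughly what that looks like with raw btrfs commands (the /.snapshots path is just an assumed layout; tools like snapper or Timeshift automate all of this):

    # Take a read-only snapshot of the root subvolume before an update.
    sudo btrfs subvolume snapshot -r / /.snapshots/root-pre-update

    # If the update breaks something, make a writable copy of the
    # snapshot and point the bootloader / fstab at it, then reboot.
    sudo btrfs subvolume snapshot /.snapshots/root-pre-update /root-restored

The exact rollback steps depend on your subvolume layout, which is why most people let snapper handle it.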
I avoid Debian, Ubuntu, and other distros that hold back package versions, because in my opinion that's where the problem starts. I shouldn't have to use workarounds to install the packages I want. Arch with the AUR has just worked so far.
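In case anyone here hasn't used the AUR: installing a package is just cloning the build recipe and running makepkg (yay, an AUR helper, is the example here, but any AUR package works the same way):

    # Clone the PKGBUILD for the package from the AUR.
    git clone https://aur.archlinux.org/yay.git
    cd yay

    # -s installs missing repo dependencies, -i installs the built package.
    makepkg -si

No hunting for third-party .deb files and no compiling dependency trees by hand.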