If all you want is something trivial that’s been done by enough people beforehand, it’s no surprise that something approaching correct gets parroted back at you.
That’s 99% of what I’m looking for. If I’m figuring something out by myself, I’m not looking it up on the internet.
I’m an engineer and I’ve found LLMs great for helping me understand an issue. When you read something online, you have to translate what the author is saying into your own thinking, and I’ve found LLMs are much better at re-framing information to match my inner dialog. I often find them more useful than Google searches when trying to find information.