Not an advertisement, I swear, but I honestly love language models. I use ChatGPT and Gemini ALL the time.
I’m someone who loves research and learning. I always have questions on all sorts of things. Before GPT I would be googling things for hours each day, visiting forums, watching YouTube videos, etc.
Now with GPT I can ask so many questions about things I am curious about, and even ask follow-up questions or ask for more in-depth explanations.
I think I am also a bit ADHD, so my brain is always jumping around different topics, which makes me curious about specific things. My latest is insects, specifically wasps.
I absolutely hate bugs, and wasps are the worst. But I am now learning more and more how important a lot of bugs actually are to the ecosystem. It’s a great way to learn and engage with a topic when you can ask follow-up questions or ask for more detail on specific aspects.
TLDR: GPT has replaced 90% of the research I used to do on Google.
Important Note: Please be aware that language models can be inaccurate and prone to mistakes, so always verify against other sources if you need accurate information and not just general knowledge.
Thankfully, for general knowledge questions it gets things correct. A lot of what I ask or am curious about is general knowledge, so it works really well for me.
It really doesn’t get general knowledge correct.
AI will often confuse itself because its answers are probabilistic. So it’s often right, but just as often it’s hallucinating random BS, because AI doesn’t really know shit. It just detects patterns.
I just saw a post on Reddit about GPT claiming Argentina is the second most populated country in South America (which is false, it’s Colombia), but since Argentina usually ranks second in most lists for the region, GPT tends to place it there anyway.
My problem is that I used to start like that too, then I’d want to ask a bit more, and then another bit more, and that’s when things get messy. At some point I usually end up annoyed that half of the conversation was based on a lie. Now I barely open the chat because of that.