Can’t help but notice that you’ve cropped out your prompt.
Played around a bit, and it seems the only way to get a response like yours is to specifically ask for it.
Honestly, I’m getting pretty sick of these low-effort misinformation posts about LLMs.
LLMs aren’t perfect, but the amount of nonsensical trash ‘gotchas’ out there is really annoying.
Here’s my first attempt at that prompt using OpenAI’s ChatGPT4. I tested the same prompt with other models as well (e.g. Llama and Wizard); both gave legitimate responses on the first attempt.
I get that it’s currently ‘in’ to diss AI, but frankly, it’s pretty disingenuous how every other post about AI I see is blatant misinformation.
Does AI hallucinate? Hell yes. It makes up shit all the time. Are the responses overly cautious? I’d say they are, but nowhere near as much as people claim. LLMs can be a useful tool. Trusting them blindly would be foolish, but I sincerely doubt the response you linked was unbiased; it was probably shaped either by previous prompts or by numerous attempts to ‘reroll’ the response until you got something you could use to build your own narrative.
I love this lmao
When chatgpt calls you the rizzler you know we living in the future
So do you have like a Mastodon where you post these? Because that’s hilarious
Especially since the stats saying that they’re wrong about 53% of the time are right there.
That’s right around 9% lower than the statistic that 62% of all statistics on the Internet are made up on the spot!
I wish I had the source on hand, but you’ll just have to trust my word - after all, 47% of the time, it’s right 100% of the time!
Joking aside, I do wish I had the link to the study. It was cited in an article from earlier this year about AI making stuff up even when it cited sources (literally lying about what was in the sources it claimed the info came from), and about how the companies behind these AIs collectively shrugged their shoulders and said “there’s nothing we can do about it” when asked what they intend to do about these “hallucinations,” as they call them.