It’s time to call a spade a spade. ChatGPT isn’t just hallucinating. It’s a bullshit machine.
From TFA (thanks @mxtiffanyleigh for sharing):
"Bullshit is ‘any utterance produced where a speaker has indifference towards the truth of the utterance’. That explanation, in turn, is divided into two “species”: hard bullshit, which occurs when there is an agenda to mislead, or soft bullshit, which is uttered without agenda.
“ChatGPT is at minimum a soft bullshitter or a bullshit machine, because if it is not an agent then it can neither hold any attitudes towards truth nor towards deceiving hearers about its (or, perhaps more properly, its users’) agenda.”
https://futurism.com/the-byte/researchers-ai-chatgpt-hallucinations-terminology
I work with software and my coworkers will occasionally tell me they ran something by ChatGPT instead of just reading the documentation. Every time it’s a bullshit waste of everyone’s time.
“Indifference” is a strange word to apply here. What would it mean for ChatGPT to “want” to be accurate?
Apple’s guy in charge of those systems called “hallucinations” exactly that last night in an on-stage interview with John Gruber.
I think “hallucinating” and “bullshitting” are pretty much synonyms in the context of LLMs. And I think they’re both equally imperfect analogies for the exact same reasons. When we talk about hallucinators & bullshitters, we’re almost always talking about beings with consciousness/understanding/agency/intent (people usually, pets occasionally), but spicy autocompleters don’t really have those things.
But if calling them “bullshit machines” is more effective communication, that’s great—let’s go with that.
To say that they bullshit reminds me of On Bullshit, which distinguishes between lying and bullshitting: “The main difference between the two is intent and deception.” But again I think it’s a bit of a stretch to say LLMs have intent.
I might say that LLMs hallucinate/bullshit, and the rules & guard rails that developers build into & around them are attempts to mitigate the madness.
@davel @ajsadauskas I enjoy the bullshitting analogy, but regression to mediocrity seems most accurate to me. I think it makes sense to call them mediocrity machines. (h/t @ElleGray)