It’s time to call a spade a spade. ChatGPT isn’t just hallucinating. It’s a bullshit machine.
From TFA (thanks @mxtiffanyleigh for sharing):
"Bullshit is ‘any utterance produced where a speaker has indifference towards the truth of the utterance’. That explanation, in turn, is divided into two “species”: hard bullshit, which occurs when there is an agenda to mislead, or soft bullshit, which is uttered without agenda.
“ChatGPT is at minimum a soft bullshitter or a bullshit machine, because if it is not an agent then it can neither hold any attitudes towards truth nor towards deceiving hearers about its (or, perhaps more properly, its users’) agenda.”
https://futurism.com/the-byte/researchers-ai-chatgpt-hallucinations-terminology
I mean, it’s all semantics. ChatGPT regurgitates shit it finds on the internet. The internet is often full of bullshit, so no one should be surprised when ChatGPT says bullshit. It has no way to tell truth from fiction. Much like many people.
A good LLM would be trained on scientific data and training materials and books and shit, not random internet comments.
If it doesn’t know, it should ask you to clarify or say “I don’t know”, but it never does that. That’s truly the most ignorant part. I mean, imagine a person who can’t say “I don’t know” and never asks questions, like they’re conversing with Kim Jong Il. You would never trust them.