It's Not Only AI That Hallucinates
Making Stuff Up Is The Most Human Thing ChatGPT Does
If you use AI, as I frequently do, you will be familiar with the phenomenon of ‘hallucinations’. That is to say, when ChatGPT doesn’t really know the answer to something, it just makes shit up and announces it as convincingly as possible.
I guess AI and humans have this much in common - although ‘they’ are much more intelligent than us carbon-based life forms, they just hate to admit that they don’t know something.
I can understand very well why humans consistently do this. It’s because they actually don’t know very much, and this makes them feel terribly worried and insecure, since everyone else is pretending to be certain about everything. If you don’t know anything, you can’t predict what’s about to happen with any degree of certainty, or even know what just happened. Mysteries baffle the human brain, and since all is mystery, these brains are engaged in a constant fantastical battle to convince themselves that the knowledge they think they ‘hold’ - memories, predictions, meanings, facts - is valid. The more uncertain they are, deep in their hearts, the more fanatical they become about insisting that their knowledge is ‘true’.



