Tim Lott's Writing Boot Camp & Philosophy Jam

It's Not Only AI That Hallucinates

Making Stuff Up Is The Most Human Thing ChatGPT Does

Tim Lott
May 09, 2026
∙ Paid

If you use AI, as I frequently do, you will be familiar with the phenomenon of ‘hallucinations’. That is to say, if ChatGPT doesn’t really know the answer to something, it just makes shit up and announces it in as convincing a way as possible.

I guess AI and humans have this much in common - although ‘they’ are much more intelligent than us carbon-based life forms, they just hate to admit that they don’t know something.

I can understand very well why humans consistently do this. It’s because they actually don’t know very much at all, and this makes them feel terribly worried and insecure, since everyone else is pretending to be certain about everything. And if you don’t know anything, you can’t predict what’s about to happen with any degree of certainty, or even know what just happened. Mysteries baffle the human brain, and since all is mystery, these brains are engaged in a constant fantastical battle to convince themselves that the knowledge they think they ‘hold’ - memories, predictions, meanings, facts - is valid. The more uncertain they are, deep in their hearts, the more fanatical they become about insisting that their knowledge is ‘true’.
