Hallucination

Also known as: AI Hallucination, Confabulation

In one sentence

When an AI model confidently generates false, fabricated, or nonsensical information as if it were fact. The model isn't lying—it's producing statistically plausible text that happens to be wrong.

Explain like I'm 12

Imagine a friend who always gives you an answer, even when they don't actually know. They're not trying to trick you—they just fill in gaps with their best guess and sound really sure about it.

In context

Hallucinations show up in many forms. ChatGPT might invent academic citations that don't exist, complete with fake authors and journal names. A coding assistant could reference API methods that were never part of a library, as the sketch below shows. In its first public demo in 2023, Google's Bard claimed the James Webb Space Telescope took the very first picture of a planet outside our solar system; in fact, the first exoplanet was imaged by the European Southern Observatory's Very Large Telescope in 2004. These errors are especially dangerous because the model's confident tone makes them hard to spot without independent fact-checking.
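
To make the coding-assistant case concrete, here is a minimal Python sketch. The method name `fetch_json` is hypothetical, invented here to stand in for a hallucinated suggestion; it is not part of the real `requests` library, which is exactly why calling it fails at runtime.

```python
import requests

# A real API call: requests.get exists and behaves as documented.
resp = requests.get("https://example.com")
print(resp.status_code)

# A hallucinated API call: an assistant might confidently suggest a
# plausible-sounding method like requests.fetch_json. It does not
# exist, so Python raises AttributeError at the point of use.
try:
    data = requests.fetch_json("https://example.com")  # hypothetical, hallucinated name
except AttributeError as err:
    print(f"Hallucinated method caught: {err}")

# A cheap sanity check before trusting generated code: verify the
# attribute actually exists on the module or object.
print(hasattr(requests, "get"))         # True
print(hasattr(requests, "fetch_json"))  # False
```

The failure mode mirrors the prose examples above: the suggestion looks statistically plausible, and nothing about its surface form signals that it is fabricated.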
