Failures

Standard term
Hallucination / confabulation
When a model generates false information and presents it as fact. Ask it for the CEO of a company, for example, and it may confidently name someone who does not exist.
Hallucination is the common industry term. Confabulation, borrowed from psychology, is more precise: the model fills gaps with plausible-sounding text, the same way a person with memory gaps might unknowingly invent details to complete a story. The model has no internal sense of what is true. It predicts text that fits the pattern, and sometimes the most pattern-fitting continuation is a fabrication. The output is especially dangerous because it reads exactly like correct information.
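The gap-filling mechanic is easier to see in miniature. Below is a deliberately toy sketch, not how real LLMs work: a bigram model that picks each next word purely by what followed it in a tiny training corpus. Ask it about a company it has never seen, and it stitches together a fluent answer from the pattern anyway. The corpus and all names are invented for illustration.

```python
import random
from collections import defaultdict

# Toy training data: the "model" has only ever seen these sentences.
corpus = [
    "the ceo of acme is jane smith",
    "the ceo of globex is john doe",
    "the cfo of acme is mary jones",
]

# Count which words follow which. This is the model's entire "knowledge".
follows = defaultdict(list)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a].append(b)

def continue_text(prompt: str, max_words: int = 8) -> str:
    """Extend the prompt one plausible word at a time.

    There is no notion of truth here, only of what tends to follow what.
    """
    words = prompt.split()
    for _ in range(max_words):
        options = follows.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

# Ask about a company the corpus says nothing about. The output is fluent,
# confident, and fabricated, e.g. "the ceo of initech is mary jones".
print(continue_text("the ceo of initech is"))
```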
Builder example
In production, hallucination means your users act on wrong information that looks authoritative. A customer support bot that invents a return policy, a legal tool that cites nonexistent case law, a medical assistant that fabricates drug interactions: each one causes real harm and erodes trust in your entire product.
Common confusion: Confident tone and citation-like formatting do not indicate accuracy. Models routinely generate realistic-looking references, URLs, and quotes that are entirely fabricated.
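One practical consequence for builders: treat every model-generated reference as unverified until it is checked against a source of truth. Here is a minimal sketch of that pattern, assuming a hypothetical KNOWN_CASES set standing in for a real citation database; the example answer and the Smithson case are invented for illustration.

```python
import re

# Hypothetical source of truth. In practice this would be a query against
# a real citation database, not an in-memory set.
KNOWN_CASES = {
    "Brown v. Board of Education, 347 U.S. 483 (1954)",
    "Miranda v. Arizona, 384 U.S. 436 (1966)",
}

# Simplified pattern for single-word party names: "A v. B, 123 U.S. 456 (1999)".
CITATION_RE = re.compile(r"[A-Z][a-z]+ v\. [A-Z][a-z]+, \d+ U\.S\. \d+ \(\d{4}\)")

def unverified_citations(model_output: str) -> list[str]:
    """Return citation-shaped strings that are absent from the database."""
    return [c for c in CITATION_RE.findall(model_output) if c not in KNOWN_CASES]

answer = (
    "Miranda v. Arizona, 384 U.S. 436 (1966) controls here, and "
    "Smithson v. Delgado, 512 U.S. 901 (1994) extends it to this fact pattern."
)

for citation in unverified_citations(answer):
    print("UNVERIFIED:", citation)  # flags the fabricated Smithson case
```

The same shape works for other citation-like output: resolve model-generated URLs before showing them to a user, and match quoted passages against the documents they supposedly came from.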