Definition

Hallucination (AI)

When AI generates false or made-up information that sounds plausible.

Full Definition

An AI hallucination occurs when a language model generates false, fabricated, or inaccurate information and presents it in a plausible, confident tone. Hallucinations are a known limitation of large language models and can include invented facts, false attributions, or entirely made-up details. For brands, hallucinations present a real risk: an AI assistant might state incorrect information about your products, pricing, or capabilities. Monitoring AI responses for hallucinations about your brand is an important part of AI brand safety.
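
To make the monitoring idea concrete, here is a minimal sketch of how a brand team might flag suspected hallucinations in an AI-generated answer. The brand details, the fact list, and the check_for_hallucinations helper are all invented for illustration; real monitoring pipelines compare responses from assistants like ChatGPT, Claude, and Perplexity against your own product and pricing data, usually with human review of anything flagged.

```python
import re

# Hypothetical ground truth about a brand; in practice this would come
# from your own product catalog or pricing database.
KNOWN_FACTS = {
    "starter plan price": "$29/month",
    "free trial length": "14 days",
}


def check_for_hallucinations(ai_response: str) -> list[str]:
    """Flag claims in an AI response that contradict known brand facts.

    A deliberately simple regex-based check for illustration only;
    production monitoring typically extracts claims more robustly and
    routes flagged responses to a human reviewer.
    """
    issues = []

    # Does the response state a monthly price that differs from the
    # known starter plan price?
    for claim in re.findall(r"\$\d+(?:\.\d{2})?/month", ai_response):
        if claim != KNOWN_FACTS["starter plan price"]:
            issues.append(
                f"Possible hallucinated price: {claim} "
                f"(expected {KNOWN_FACTS['starter plan price']})"
            )

    # Does the response mention a free trial length that differs from
    # the known one?
    for days in re.findall(r"(\d+)-day free trial", ai_response):
        if f"{days} days" != KNOWN_FACTS["free trial length"]:
            issues.append(
                f"Possible hallucinated trial length: {days} days "
                f"(expected {KNOWN_FACTS['free trial length']})"
            )

    return issues


if __name__ == "__main__":
    response = "Acme's starter plan costs $49/month and includes a 30-day free trial."
    for issue in check_for_hallucinations(response):
        print(issue)
```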
