Artificial Intelligence
Hallucination (AI)
Also known as: Hallucination in AI & Model hallucination
Plain English
When AI makes things up.
Definition
Hallucination in AI refers to when a model generates incorrect or fabricated information that appears plausible.
In practice
Common in generative models, especially when asked about unknown or ambiguous topics.
The reality
Hallucinations are a known limitation of current AI systems and cannot be fully eliminated.
FAQ
Common questions
A few practical answers to the questions that usually come up around this term.
What is an AI hallucination?
It is when AI generates false or made-up information.
Why do AI hallucinations happen?
Because models predict likely outputs rather than verifying facts.
Are hallucinations dangerous?
They can be if outputs are trusted without verification.
How do you reduce hallucinations?
By using better prompts, validation, and external data sources.
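One validation approach is to check a model's claims against a trusted external source before accepting them. A minimal sketch, using a hypothetical substring check and made-up example data rather than a production fact-verifier:

```python
# Minimal sketch: flag generated claims that are not supported by a
# trusted source text. The source and claims below are illustrative.

def unsupported_claims(claims, source_text):
    """Return claims that never appear in the trusted source text."""
    source = source_text.lower()
    return [claim for claim in claims if claim.lower() not in source]

source = "The Eiffel Tower is in Paris and was completed in 1889."
claims = [
    "The Eiffel Tower is in Paris",      # supported by the source
    "The Eiffel Tower was built in 1920" # fabricated detail
]

# Only the fabricated claim is flagged for human review.
print(unsupported_claims(claims, source))
```

Real systems use far stronger checks (retrieval, entailment models, citation matching), but the principle is the same: treat outputs as unverified until grounded in a source.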