
Grounding

Also known as: Grounded Generation, Factual Grounding

In one sentence

Connecting AI outputs to verified sources or real data to reduce hallucinations and ensure responses are factually accurate and verifiable.

Explain like I'm 12

Making sure the AI doesn't just make stuff up by requiring it to back up its answers with real facts from trusted sources—like citing your sources in a school report instead of writing whatever sounds right.

In context

Grounding is how companies make AI outputs trustworthy. RAG (Retrieval-Augmented Generation) is the most common grounding technique—it retrieves relevant documents and feeds them to the AI so responses are based on actual data rather than the model's training memory. Google's Gemini grounds responses in live search results. Microsoft's Copilot cites specific documents when answering questions about your files. Perplexity AI shows numbered citations for every claim. Without grounding, AI models rely solely on patterns learned during training, which can be outdated or simply wrong.
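The retrieval-and-cite pattern described above can be sketched in a few lines. This is a minimal, illustrative toy, not any vendor's API: the keyword-overlap scoring, the `retrieve` and `build_grounded_prompt` helpers, and the sample documents are all assumptions for demonstration. A production system would use embedding-based search and send the assembled prompt to a real model.

```python
# Minimal sketch of grounding via retrieval-augmented generation (RAG).
# The document store, scoring function, and prompt template are
# illustrative assumptions, not a specific product's implementation.

def retrieve(query, documents, k=2):
    """Rank documents by naive keyword overlap with the query
    (real systems use embedding similarity instead)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d["text"].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_grounded_prompt(query, documents):
    """Assemble a prompt that tells the model to answer only from
    the retrieved sources and cite them by number."""
    sources = retrieve(query, documents)
    context = "\n".join(
        f"[{i + 1}] {d['text']}" for i, d in enumerate(sources)
    )
    return (
        "Answer using ONLY the sources below. "
        "Cite each claim with its source number.\n\n"
        f"Sources:\n{context}\n\nQuestion: {query}"
    )

# Toy corpus: the contents are made up for illustration.
docs = [
    {"text": "The 2024 annual report lists revenue of $4.2B."},
    {"text": "Grounding ties model output to retrieved evidence."},
    {"text": "Office hours are Monday through Friday."},
]
prompt = build_grounded_prompt("What was revenue in the annual report?", docs)
```

Because the prompt restricts the model to the numbered sources, its answer can be checked against them, which is what makes the output verifiable in the way Perplexity-style citations are.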
