GEO Glossary

H

Hallucination (LLM)

When an AI model generates false or fabricated information and presents it as fact, such as citing a source or statistic that does not exist.