An AI hallucination occurs when an AI model generates false, inaccurate, or illogical information. The model produces a response that is not actually grounded in its training data, yet presents it as fact, even when the output defies logic or sound reasoning. Models sometimes “hallucinate” a response in an attempt to please the user, filling in gaps with answers that end up biased, one-sided, or simply unsupported.