Harrison A.

What are AI Hallucinations?



An AI hallucination occurs when an AI model generates a response or forecast without the training data that would allow it to do so confidently.


For example, a chatbot asked about a company’s earnings may generate a random but “plausible” value, even though the true figure was never in its training data.


Hallucinations can be very dangerous in consumer-facing AI models, especially ones that feel trustworthy and convincing. Explainable AI (XAI) can aid in verifying a model’s responses by citing sources (or the specific features used) alongside each answer or forecast.
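To make the idea concrete, here is a minimal Python sketch of that citation pattern. Everything in it is a hypothetical illustration (the CitedAnswer class, the toy knowledge base, and the company name ACME are all invented for this example): the model answers only when it can point to a supporting source, and otherwise admits uncertainty instead of inventing a plausible value.

```python
from dataclasses import dataclass


@dataclass
class CitedAnswer:
    answer: str
    sources: list[str]  # documents the answer was drawn from


# Toy "knowledge base" standing in for a model's verified data.
# (Hypothetical entries invented for this sketch.)
KNOWLEDGE_BASE = {
    "acme q3 earnings": ("ACME reported Q3 revenue of $12.4M.",
                         "acme-q3-filing.pdf"),
}


def answer_with_citation(question: str) -> CitedAnswer:
    """Answer only when a supporting source exists; otherwise
    admit uncertainty rather than hallucinating a value."""
    key = question.lower().strip("?")
    if key in KNOWLEDGE_BASE:
        text, source = KNOWLEDGE_BASE[key]
        return CitedAnswer(answer=text, sources=[source])
    # No grounding data: refuse instead of inventing a number.
    return CitedAnswer(
        answer="I don't have a verified source for that.",
        sources=[],
    )


print(answer_with_citation("ACME Q3 earnings?"))
print(answer_with_citation("Globex Q3 earnings?"))
```

The design choice is the point: by always returning sources alongside the answer, the system gives the user something to verify, and an empty source list is itself a signal that the response should not be trusted.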


As AI models become more sophisticated and easier to trust, staying aware of AI hallucinations is key when working with them.

