Frequently Asked Questions

General

Using the Tools

What if GPT gives wrong or hallucinated answers?

GPT models can occasionally generate false or fabricated information, known as hallucinations. Below is one real example of a hallucinated answer from an AI session, shown in two screenshots for context:

Hallucinated GPT Response 1
This response described a company financial ratio that does not exist.
Hallucinated GPT Response 2
This response invented a news event that never occurred.

Privacy & Security