Hospitals use a transcription tool powered by a hallucination-prone OpenAI model

Hallucination Study

Researchers found that the Whisper model frequently invents entire passages of text.

Nabla's Awareness

Nabla, a company that uses Whisper for transcription, is aware of the hallucination problem.

OpenAI's Response

OpenAI acknowledges the issue and says it is working to improve its model.
