OpenAI Whisper Faces Bad Press for Hallucinations in Healthcare Transcription
2024.11.20

Mainstream media sources have picked up on a challenge that could have implications for AI implementation in healthcare settings — including AI-powered translation and interpreting.

An October 26, 2024 report by the Associated Press found that OpenAI’s transcription tool, Whisper, is plagued by hallucinations — i.e., fabricated content.

“While most developers assume that transcription tools misspell words or make other errors, engineers and researchers said they had never seen another AI-powered transcription tool hallucinate as much as Whisper,” the article read.
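For context, the open-source release of Whisper can be run in a few lines of Python. Below is a minimal sketch using the openai-whisper package; the model size and file name are illustrative placeholders, not details from the AP report.

```python
# Minimal transcription sketch with the open-source openai-whisper package
# (pip install openai-whisper). Model size and file name are placeholders.
import whisper

model = whisper.load_model("base")                # larger options: "small", "medium", "large"
result = model.transcribe("visit_recording.mp3")  # returns text plus per-segment metadata
print(result["text"])
```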

The stakes are particularly high when it comes to hallucinations in medical transcription. 

According to the AP, more than 30,000 clinicians and 40 healthcare systems have already begun using a tool based on Whisper. Nabla, the company behind the Whisper-powered tool, told the AP that the product has been used for about 7m medical visits so far. 

“Yeahhhhhhh, maybe we should wait on this one,” commented Tony Brown, a self-described communication coach for healthcare professionals, on LinkedIn.

Nabla’s tool “reportedly erases original audio recordings ‘for data safety reasons,’” WIRED noted in its own coverage, pointing out that this practice could be particularly problematic for deaf patients, who would not be able to check a transcript against the audio.

The same limitation might be faced by patients with limited English proficiency (LEP), who represent a major demand driver for language services, especially interpreting, in the US healthcare system.

In addition to monolingual transcription, Whisper also performs multilingual transcription and can translate speech into English, more complex tasks that are unlikely to reduce the incidence of hallucinations.
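In the open-source package, both behaviors run through the same transcribe() call; a language hint and a translate task are plain parameters. A brief sketch, with file names as placeholders:

```python
# Multilingual use of the same loaded model (file names are placeholders).
# With no language hint, Whisper auto-detects the spoken language.
result_es = model.transcribe("consulta_es.mp3", language="es")

# task="translate" makes Whisper emit English regardless of the input language.
result_en = model.transcribe("consulta_es.mp3", task="translate")
print(result_en["text"])
```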

The Verge noted that one peer-reviewed study found that “hallucinations disproportionately occur for individuals who speak with longer shares of non-vocal durations.” Might this also be the case during encounters facilitated through consecutive interpreting, in which speakers sometimes pause between taking turns?
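Long stretches of silence are a commonly reported trigger for Whisper hallucinations, and the open-source implementation exposes decoding thresholds that practitioners often adjust as a partial workaround. The sketch below shows those parameters at their documented defaults; whether tuning them would help in interpreted encounters is an open question, not something the cited study tested.

```python
# Partial-mitigation sketch for pause-heavy audio. Values shown are the
# package defaults; tightening them is a community workaround, not a fix.
result = model.transcribe(
    "interpreted_visit.mp3",            # placeholder file name
    condition_on_previous_text=False,   # stop feeding prior text back in; curbs repetition loops
    no_speech_threshold=0.6,            # skip a segment when no_speech_prob exceeds this...
    logprob_threshold=-1.0,             # ...and its average log-probability falls below this
    compression_ratio_threshold=2.4,    # retry decoding when output looks degenerately repetitive
)
```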

The hallucinations seem to be widespread: TechCrunch spoke with a developer who created 26,000 transcriptions using Whisper — only to find hallucinations in nearly every one. 

Hallucinations could range from “racial commentary to imagined medical treatments,” TechCrunch added. 
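Whisper's output does include per-segment confidence statistics that could, in principle, help flag passages for human review, though none of the outlets report Nabla doing so. A hedged screening sketch using fields the open-source package actually returns, with illustrative thresholds:

```python
# Flag transcript segments for human review using per-segment statistics
# from the open-source package. Thresholds here are illustrative.
result = model.transcribe("visit_recording.mp3")
for seg in result["segments"]:
    if (seg["compression_ratio"] > 2.4        # repetitive text often marks a hallucination loop
            or seg["avg_logprob"] < -1.0      # low model confidence in the decoded tokens
            or seg["no_speech_prob"] > 0.6):  # model suspects no one was speaking
        print(f"Review {seg['start']:.1f}-{seg['end']:.1f}s: {seg['text']!r}")
```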


OpenAI’s technology has already made a name for itself, primarily through ChatGPT and its offshoots. The spotlight on Whisper’s shortcomings comes on the heels of OpenAI’s launch of a real-time speech translation API. It remains to be seen whether users will be put off by the now-publicized drawbacks.

Minnesota’s Enterprise Translation Office, for instance, has already used OpenAI technology to establish a new translation workflow and is currently exploring a “limited pilot project” using ChatGPT’s voice capabilities for real-time interpretation.

Demand for healthcare interpreting in the US continues to grow as the nation’s population ages and becomes more diverse. In September 2024, the US House of Representatives unanimously passed a bill to expand language access for telehealth services — a setup into which automated transcription could conceivably be integrated.

Source: slator.com

