Researchers Say AI Tool Used in Hospitals Invents Things No One Ever Said
Monday, October 28, 2024, 04:25 PM, from Slashdot
AmiMoJo shares a report: Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near 'human level robustness and accuracy.' But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers.
Those experts said some of the invented text -- known in the industry as hallucinations -- can include racial commentary, violent rhetoric and even imagined medical treatments. Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.

It's impossible to compare Nabla's AI-generated transcript to the original recording because Nabla's tool erases the original audio for 'data safety reasons,' Nabla's chief technology officer Martin Raison said.

Read more of this story at Slashdot.
https://tech.slashdot.org/story/24/10/28/1510255/researchers-say-ai-tool-used-in-hospitals-invents-t...