Researchers Say AI Tool Used in Hospitals Invents Things No One Ever Said
Monday, October 28, 2024, 04:25 PM, from Slashdot
Those experts said some of the invented text -- known in the industry as hallucinations -- can include racial commentary, violent rhetoric, and even imagined medical treatments. Such fabrications are problematic, the experts said, because Whisper is used across a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies, and create subtitles for videos. It is impossible to compare Nabla's AI-generated transcripts to the original recordings because Nabla's tool erases the original audio for "data safety reasons," Nabla's chief technology officer Martin Raison said. Read more of this story at Slashdot.
https://tech.slashdot.org/story/24/10/28/1510255/researchers-say-ai-tool-used-in-hospitals-invents-t...