AI Summaries Turn Real News Into Nonsense, BBC Finds
Wednesday, February 12, 2025, 11:30 PM, from Slashdot
- 51 percent of all AI answers to questions about the news were judged to have significant issues of some form.
- 19 percent of AI answers which cited BBC content introduced factual errors -- incorrect factual statements, numbers, and dates.
- 13 percent of the quotes sourced from BBC articles were either altered from the original source or not present in the article cited.

But which chatbot performed worst? '34 percent of Gemini, 27 percent of Copilot, 17 percent of Perplexity, and 15 percent of ChatGPT responses were judged to have significant issues with how they represented the BBC content used as a source,' the Beeb reported. 'The most common problems were factual inaccuracies, sourcing, and missing context.'

In an accompanying blog post, BBC News and Current Affairs CEO Deborah Turness wrote: 'The price of AI's extraordinary benefits must not be a world where people searching for answers are served distorted, defective content that presents itself as fact. In what can feel like a chaotic world, it surely cannot be right that consumers seeking clarity are met with yet more confusion.

'It's not hard to see how quickly AI's distortion could undermine people's already fragile faith in facts and verified information. We live in troubled times, and how long will it be before an AI-distorted headline causes significant real world harm? The companies developing Gen AI tools are playing with fire.'

Training cutoff dates for various models certainly don't help, yet the research lays bare the weaknesses of generative AI in summarizing content. Even with direct access to the information they are being asked about, these assistants still regularly pull 'facts' from thin air.

Read more of this story at Slashdot.
https://news.slashdot.org/story/25/02/12/2139233/ai-summaries-turn-real-news-into-nonsense-bbc-finds...
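The quote-fidelity finding is easy to picture as a mechanical check: does a quote the assistant attributes to an article actually appear in that article? Below is a minimal Python sketch of such a check, mirroring the failure mode behind the 13 percent figure. The function names, the normalization, and the example strings are illustrative assumptions, not the BBC's actual methodology.

```python
import re

def normalize(text: str) -> str:
    # Collapse whitespace and straighten curly quotes so trivial
    # formatting differences don't count as alterations.
    text = text.replace("\u2018", "'").replace("\u2019", "'")
    text = text.replace("\u201c", '"').replace("\u201d", '"')
    return re.sub(r"\s+", " ", text).strip().lower()

def verify_quotes(summary_quotes: list[str], source_article: str) -> dict:
    # Flag quotes an AI summary attributes to an article that do not
    # appear verbatim in that article. Returns quote -> found-in-source.
    source = normalize(source_article)
    return {q: normalize(q) in source for q in summary_quotes}

# Hypothetical example: the second "quote" was altered by the assistant.
article = 'The minister said the policy would "take effect next spring".'
quotes = ["take effect next spring", "take effect this spring"]
print(verify_quotes(quotes, article))
# {'take effect next spring': True, 'take effect this spring': False}
```

A substring check like this only catches verbatim mismatches; judging whether a paraphrase distorts meaning, as the BBC reviewers did, still takes a human reader.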