Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System
Monday, December 9, 2024, 11:28 AM, from MacRumors
Filed in Northern California on Saturday, the lawsuit represents a potential group of 2,680 victims and alleges that Apple's failure to implement previously announced child safety tools has allowed harmful content to continue circulating, causing ongoing harm to victims.

In 2021, Apple announced plans to implement CSAM detection in iCloud Photos, alongside other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and policy groups who argued the technology could create potential backdoors for government surveillance. Apple subsequently postponed and later abandoned the initiative.

Explaining its decision at the time, Apple said that implementing universal scanning of users' private iCloud storage would introduce major security vulnerabilities that malicious actors could potentially exploit. Apple also expressed concerns that such a system could establish a problematic precedent: once content scanning infrastructure exists for one purpose, it could face pressure to expand into broader surveillance applications across different types of content and messaging platforms, including those that use encryption.

The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant. The lawsuit argues that Apple's decision not to proceed with its announced safety measures has forced victims to repeatedly relive their trauma.

In response to the lawsuit, Apple spokesperson Fred Sainz underlined the company's commitment to fighting child exploitation, stating that Apple is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." Apple pointed to existing features like Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing child protection efforts.
https://www.macrumors.com/2024/12/09/apple-hit-with-lawsuit-abandoned-cam-detection/