
Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System

Monday, December 9, 2024, 11:28 AM, from MacRumors
Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud Photos for child sexual abuse material (CSAM), according to a report from The New York Times.

Filed in Northern California on Saturday, the lawsuit represents a potential class of 2,680 victims and alleges that Apple's failure to implement its previously announced child safety tools has allowed harmful content to continue circulating, compounding the victims' ongoing harm.

In 2021, Apple announced plans to implement CSAM detection in iCloud Photos, alongside other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and policy groups who argued the technology could create a backdoor for government surveillance. Apple subsequently postponed and later abandoned the initiative.

Explaining its decision at the time, Apple said that implementing universal scanning of users' private iCloud storage would introduce major security vulnerabilities that malicious actors could exploit. Apple also expressed concern that such a system could set a problematic precedent: once content-scanning infrastructure exists for one purpose, it could face pressure to expand into broader surveillance across other types of content and messaging platforms, including those that use encryption.

The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant. The lawsuit argues that Apple's decision not to proceed with its announced safety measures has forced victims to repeatedly relive their trauma.

In response to the lawsuit, Apple spokesperson Fred Sainz underlined the company's commitment to fighting child exploitation, stating that Apple is "urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users." Apple pointed to existing features such as Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing child protection efforts.

This article, "Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System," first appeared on MacRumors.com.
https://www.macrumors.com/2024/12/09/apple-hit-with-lawsuit-abandoned-cam-detection/

