
Apple sued over 2022 dropping of CSAM detection features

Sunday, December 8, 2024, 10:46 PM, from AppleInsider
A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material (CSAM).

Apple has retained nudity detection in images, but dropped some CSAM protection features in 2022.

Apple originally introduced a plan in late 2021 to protect users from CSAM by scanning uploaded images on-device using a hashing system. It would also warn users before they sent or received photos containing algorithmically detected nudity.

The nudity-detection feature, called Communication Safety, remains in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.

Continue Reading on AppleInsider
https://appleinsider.com/articles/24/12/08/apple-sued-over-2022-dropping-of-csam-detection-features?...
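To illustrate the general idea behind hash-based image matching described above, here is a minimal sketch. Note the assumptions: Apple's proposed system used a perceptual hash ("NeuralHash"), which tolerates small image changes; a cryptographic hash like SHA-256 is used here purely for illustration, and all names and sample data are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# In a real deployment this set would come from child-safety organizations;
# the entry here is fabricated for illustration only.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known set.

    A perceptual hash (as in Apple's proposal) would match
    near-duplicates as well; SHA-256 matches exact bytes only.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_hash(b"example-flagged-image-bytes"))  # True
print(matches_known_hash(b"some-other-image"))             # False
```

The key design point, and the source of the privacy debate, is that this comparison was to run on-device before upload, rather than on Apple's servers.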


News copyright owned by their original publishers | Copyright © 2004 - 2024 Zicos / 440Network