
Apple Removes Mention of CSAM Detection on Child Safety Page, Code Remains in iOS

Wednesday, December 15, 2021, 03:56 PM, from TheMacObserver
Redditor u/AsuharietYgvar spotted a change on Apple’s page that lists child safety features. The company has removed any mention of its controversial CSAM detection plans in iCloud Photos.
Update: In a statement to The Verge, Apple spokesperson Shane Bauer said the company’s position hasn’t changed since September: Apple still plans to move forward with the detection feature and release it eventually.
iOS 15 Child Safety
Apple announced the feature in August as part of a future version of iOS 15. It would detect images of child sexual abuse material (CSAM) as they are uploaded to iCloud Photos by checking whether each image’s hash matches one in a database supplied by the National Center for Missing and Exploited Children (NCMEC). Only images whose hashes appear in that database would be flagged.
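At its core, the published design is a hash lookup: compute a fingerprint of each uploaded image and compare it against a database of known CSAM hashes. The Swift sketch below illustrates only that general idea; it uses an ordinary cryptographic hash and hypothetical names as stand-ins for Apple’s perceptual NeuralHash, the blinded NCMEC database, and the threshold-based on-device matching Apple actually described.

import Foundation
import CryptoKit

// Illustrative sketch only. SHA-256 stands in for Apple's perceptual
// NeuralHash, and the database is a plain in-memory set rather than
// the blinded hash database Apple described. All names are hypothetical.
struct HashMatcher {
    // Hex digests of known images, e.g. loaded from a vetted database.
    let knownHashes: Set<String>

    // Hash the image data and check it against the known set.
    func matches(imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage: an upload is flagged only if its hash appears in the database.
let matcher = HashMatcher(knownHashes: ["<hash from database>"])
let upload = Data("example image bytes".utf8)
print(matcher.matches(imageData: upload) ? "match: flag for review" : "no match")

Note that a cryptographic hash like SHA-256 only matches byte-identical files; Apple’s NeuralHash is a perceptual hash designed so that visually similar images produce the same fingerprint, which is the key difference this simplification glosses over.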
The web page had previously said:
Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
Communication safety in Messages did still launch with iOS 15.2. The Messages app includes tools to warn children when they receive or send photos that contain nudity. These features are not enabled by default; if parents opt in, the warnings are turned on for the child accounts in their Family Sharing plan.
Redditor u/AsuharietYgvar has also claimed to have extracted the NeuralHash algorithm used for CSAM detection, and says the code is still present as of iOS 15.2. It remains unknown whether Apple will abandon the plan entirely or release it in a future version of its operating systems.
Tags: featured, iCloud Photos, iOS 15, kids, privacy
https://www.macobserver.com/news/product-news/apple-removes-mention-csam-detection/?utm_source=macob...