Apple’s controversial iCloud Photos CSAM scanning scrubbed from site

Wednesday, December 15, 2021, 02:15 PM, from Mac 911
Earlier this year, Apple announced a new system designed to catch potential CSAM (Child Sexual Abuse Material) by scanning iPhone users’ photos. After an instant uproar, Apple delayed the system until later in 2021, and now it seems like it might not arrive for a while longer, if at all.

Just days after releasing the Messages component of its multi-pronged child-safety approach in iOS 15.2, Apple has removed all references to the CSAM scanning tech from Apple.com. As spotted by MacRumors, the previous Child Safety webpage now leads to a support page for the communication safety in Messages feature. Apple says the feature is still “delayed” and not canceled, though it will clearly miss its self-imposed 2021 deadline.

Apple’s CSAM detection announcement generated controversy almost as soon as it was made. The system as described computes hashes of photos on users’ iPhones and checks them against a database of known CSAM hashes maintained by the National Center for Missing and Exploited Children. If a match is found, the photo is reviewed by a person at Apple after it is uploaded to iCloud, and if it does contain CSAM, the person who uploaded it is referred to the appropriate authorities.
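To make the hash-matching idea concrete, here is a minimal, hypothetical sketch in Swift. It uses SHA-256 from CryptoKit as a stand-in hash function; Apple’s actual system reportedly uses a perceptual “NeuralHash” and a private set intersection protocol, neither of which is reproduced here, and the type and hash values below are illustrative only.

import Foundation
import CryptoKit

// Hypothetical sketch of matching against a known-hash list. Apple's real
// design uses a perceptual NeuralHash plus private set intersection,
// not a raw SHA-256 over image bytes.
struct HashMatcher {
    // Placeholder list; a real deployment would ship a blinded database.
    let knownHashes: Set<String>

    // Hex-encoded SHA-256 digest of the raw image bytes.
    func digest(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // True if the image's digest appears on the known-hash list. In
    // Apple's described design, a threshold of matches must accumulate
    // before any human review is triggered.
    func matches(_ imageData: Data) -> Bool {
        knownHashes.contains(digest(of: imageData))
    }
}

// Usage: an image is flagged only if its digest is already on the list.
let matcher = HashMatcher(knownHashes: ["example-placeholder-digest"])
let flagged = matcher.matches(Data([0x00, 0x01, 0x02]))

Note that an exact cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is what lets the real system recognize resized or re-encoded copies of a known image.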

Arguments against the feature centered mainly on the possibility that it could be repurposed. For example, a government could demand that Apple build a similar process to check for images deemed detrimental to that government’s policies. Others were concerned that the scanning happens on the iPhone itself, even though matches aren’t reported until photos are uploaded to iCloud.

As part of the original announcement, Apple launched a new Messages feature in iOS 15.2 that can warn children and their parents when photos containing nudity are sent or received. Unlike the proposed CSAM scanning, the feature is off by default, and parents must explicitly opt in through Family Sharing. Siri and Search will also warn users who attempt to look up potential CSAM.

Update, 12:10 p.m. ET: Apple says the feature is still delayed but not canceled.
https://www.macworld.com/article/559731/apple-csam-icloud-photo-scanning-removed.html