Apple Explains Pullback from CSAM Photo-Scanning
Wednesday, September 6, 2023, 05:48 PM, from TidBITS
In a letter responding to a child safety group, Apple has outlined its reasons for dropping its proposed scanning for child sexual abuse material in iCloud Photos. Instead, the company is focusing on its Communication Safety technology, which detects nudity in transferred images and videos.
https://tidbits.com/2023/09/06/apple-explains-pullback-from-csam-photo-scanning/