Apple delivers on-device nude image detection to Macs, iPhones, and iPads

Monday, December 13, 2021, 11:29 PM, from MacDailyNews
Apple released macOS Monterey 12.1, iOS 15.2, and iPadOS 15.2 on Monday, introducing a previously announced, opt-in Communication Safety feature for Messages that scans photos on-device in order to warn children, but not their parents, when they receive or send photos containing nudity.

MacDailyNews Note: This iMessage feature is not the untenable (delayed, not canceled) backdoor surveillance system introduced via the trojan horse of “detecting child sexual abuse material (CSAM).”
Mark Gurman for Bloomberg News:

Apple had attempted to launch a trio of new features geared toward protecting children earlier this year: the Messages feature, new options in Siri for learning how to report child abuse, and technology that would detect CSAM (child sexual abuse material) in iCloud photos. But the approach drew outcry from privacy experts, and the rollout was delayed.
Now Apple is delivering the first two features in iOS 15.2, and there’s no word when the CSAM detection function will reappear.
The image detection works like this: Child-owned iPhones, iPads and Macs will analyze incoming and outgoing images received and sent through the Messages app to detect nudity. If the system finds a nude image, the picture will appear blurred, and the child will be warned before viewing it. If children attempt to send a nude image, they will also be warned.
In both instances, the child will have the ability to contact a parent through the Messages app about the situation, but parents won’t automatically receive a notification. That’s a change from the initial approach announced earlier this year.
In order for the feature to work, parents need to enable it on a family-sharing account.
Some privacy advocates have panned Apple’s child safety features, saying that the technology could be used by governments to surveil citizens. But the opt-in nature and on-device processing for this feature could quell such concerns—at least for now.
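To make the flow Bloomberg describes concrete, here is a minimal Swift sketch of the logic: opt-in via Family Sharing, on-device analysis, blur-and-warn when nudity is detected, and child-initiated (never automatic) parent contact. Every type and function name below is hypothetical; Apple has not published the actual implementation or any API for this feature.

```swift
import Foundation

/// Hypothetical result from an on-device image classifier.
/// No image ever leaves the device for this check.
enum ImageAnalysisResult {
    case safe
    case sensitive // nudity detected
}

/// Stand-in for the local ML model that inspects the photo's pixel data.
func analyzeOnDevice(_ imageData: Data) -> ImageAnalysisResult {
    // Placeholder: a real implementation would run a local model here.
    return .safe
}

struct ChildAccountSettings {
    /// Parents must turn this on for the child's account via Family Sharing.
    var communicationSafetyEnabled: Bool
}

enum MessagePhotoAction {
    case showNormally
    case blurAndWarn(allowChildToContactParent: Bool)
}

/// Decide how Messages should present an incoming or outgoing photo.
func handlePhoto(_ imageData: Data, settings: ChildAccountSettings) -> MessagePhotoAction {
    // The feature is opt-in: do nothing unless a parent has enabled it.
    guard settings.communicationSafetyEnabled else { return .showNormally }

    switch analyzeOnDevice(imageData) {
    case .safe:
        return .showNormally
    case .sensitive:
        // The photo is blurred and the child is warned before viewing
        // (or before sending). The child may choose to message a parent,
        // but no notification is sent to the parent automatically.
        return .blurAndWarn(allowChildToContactParent: true)
    }
}
```

The key design point the sketch captures is that both the analysis and the decision happen entirely on the child's device, and the only path to a parent is one the child initiates.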

MacDailyNews Take: While the opt-in element for on-device nude image detection is certainly welcome, how much of a leap would it be to enable scanning for any user, not just children, and for content other than nudity?
How much further of a leap would it be for those like Apple who abide by local laws — regardless of the law in question — to be legally forced by governments to send them clandestine notifications whenever “illegal” content is discovered on/by a user’s device?
Imagine China scanning for Winnie-the-Pooh on devices in an effort to weed out critics of Xi Jinping (and/or lovers of anthropomorphic teddy bears with a honey fetish). In 2017, a list of thousands of images, including those depicting Vladimir Putin in full makeup, was outlawed in Russia. Extrapolate.
Further, given Apple’s newfound ability to read text in images and convert it to actual text data (Live Text), any bastardization of this innocuous-sounding, opt-in, on-device Messages photo scanning could quickly become the stuff of dystopian nightmares.


https://macdailynews.com/2021/12/13/apple-delivers-on-device-nude-image-detection-to-macs-iphones-an...