
The facts, fiction, and fantasy of Apple’s child abuse scanning tools

Tuesday, August 10, 2021, 12:00 PM, from Mac 911
The Macalope is glad he was off last week because he needed the rest before coming back to this week’s BLAZZLEFROZZLEROGGLE!

Last Thursday, Apple announced that it would be implementing several new technologies in upcoming versions of its operating systems designed to identify and flag child pornography. Sounds like a good idea, right? Well, yes… and also no.

If that sounds like there’s some nuance, you’re right. But at least we can all relax in the knowledge that humans handle nuance super well.

[nervous smile, tugging at collar]

Yeah, the horny one has seen a number of reactions to this ranging from “ALL IS WELL, CITIZEN!” to “THEY TOOK AWAY MY NAME AND GAVE ME A NUMBER!” The Washington Post, which loves to “AH-HA!” Apple, published a piece that is resplendent in both its flamboyant j’accusosity and its numerous inaccuracies. In footnotes to his piece on Apple’s announcement, John Gruber details exactly what’s wrong with The Post’s piece.

It is hilarious to the Macalope that the domain of Apple gotcha-ism has graduated from the Forbes contributor network and red tide clambake to the pages of the paper that took down Nixon. Most recently, The Post tried to make hay from a report sample in which more iPhones were infected with spyware than Android devices. The problem: more iPhones showed up infected simply because iOS’s logs are better, and the researchers cautioned against making relative judgments about the platforms based on their findings.

But before you either have a fit about Apple’s new anti-child-pornography technology or rush to Apple’s defense, the Macalope suggests you read Apple’s page on Child Safety, Gruber’s aforementioned post on the topic, and Ben Thompson’s (subscription) if you can. The Macalope knows that knowing things before you opine on them is old-fashioned, but just humor him this once, can’t you?


Short story, though: there are two controversial aspects to this. First, if you store your photos in iCloud, Apple will compare their identifiers to a database of identifiers of known child pornography compiled by a government organization tasked with tracking such material. If it finds a number of such images above a certain threshold, Apple will be alerted and look into the situation. The automated system doesn’t look at or scan the images; it just checks identifiers to see if any match known child pornography.
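For the code-curious, here is a very rough sketch of that threshold idea in Swift. It is purely illustrative: the names (photoIdentifier, reportThreshold, shouldEscalate) and the threshold value are made up for this column, and it bears no resemblance to Apple’s actual on-device hashing and cryptographic matching machinery.

```swift
import Foundation

// Hypothetical threshold; Apple had not published the real figure at announcement time.
let reportThreshold = 30

// Placeholder for deriving an identifier from image content. A real system
// computes a perceptual hash of the image, not anything this simple.
func photoIdentifier(for photoURL: URL) -> String {
    photoURL.lastPathComponent
}

// Count how many of the user's iCloud photos match identifiers in the known database.
func matchCount(photos: [URL], knownIdentifiers: Set<String>) -> Int {
    photos.map { photoIdentifier(for: $0) }
          .filter { knownIdentifiers.contains($0) }
          .count
}

// Nothing is flagged for human review until the number of matches crosses the
// threshold; the images themselves are never "looked at" by this check.
func shouldEscalate(photos: [URL], knownIdentifiers: Set<String>) -> Bool {
    matchCount(photos: photos, knownIdentifiers: knownIdentifiers) >= reportThreshold
}
```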

Second, Apple is providing an opt-in option for photos messaged to or from child iCloud accounts on iOS. In describing this feature, Apple says:

Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.
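As for that Messages feature, a minimal sketch of the opt-in, on-device flow Apple describes might look like the following. Again, the names here (ChildAccount, looksSexuallyExplicit, shouldWarn) are the Macalope’s inventions for illustration, not Apple’s API; the point is that both the classification and the decision stay on the device.

```swift
import Foundation

// The feature is opt-in, enabled per family for child accounts.
struct ChildAccount {
    let parentalWarningsEnabled: Bool
}

// Placeholder for the on-device machine learning classifier Apple describes;
// nothing computed here leaves the device.
func looksSexuallyExplicit(_ imageData: Data) -> Bool {
    false
}

// Returns true if the attachment should be blurred and a warning shown.
// The decision happens entirely on the device; Apple never sees the message.
func shouldWarn(about imageData: Data, for account: ChildAccount) -> Bool {
    account.parentalWarningsEnabled && looksSexuallyExplicit(imageData)
}
```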

The concerns about the first technology center on what happens if a government comes to Apple with its own database, one that includes images of political dissent or support for “subversive” ideas such as underhanded toilet paper rolling (which, while inherently wrong, should not be persecuted). Apple has stated categorically that it will reject any such requests.

The Macalope certainly hopes they stick to that because when you give a dictator knowledge of cookie-stealing technology… he’s gonna wanna steal some cookies.

That technology only examines photos that are stored in iCloud Photos. If you want to opt out, don’t store your photos in iCloud. That’s a pain for many, but at least it’s consistent with Apple’s existing agreement with law enforcement that while it can’t/won’t unlock iPhones, it can/will unlock iCloud backups.

The complaints about the second technology are that, while it’s opt-in and currently targeted only at child accounts, it is dangerous to have on-device scanning of images implemented at all, because… things change! Musical styles, tie widths, and, yes, the lengths to which governments will go to get companies to poke into the lives of private citizens.

Given Apple’s commitment to keeping Messages encrypted during transmission, this is the only way to do it. The only alternative is to simply not scan for these images at all. You can argue that’s what Apple should do. But it’s worth noting that, as Thompson points out, governments are moving toward mandating that companies look for child pornography, partly because it’s really one of the worst things imaginable and partly because “Won’t someone think of the children?” really sells with voters.

Protecting children is great. We all want that. What we don’t want is to make it easier for bad people to take advantage of our good nature.

And, as John Gruber notes, it’s possible that Apple is implementing these technologies as a precursor to “providing end-to-end encryption for iCloud Photo Library and iCloud device backups.” In other words, as a way to cut off complaints of law enforcement and legislators when it provides more individual privacy protection. The Macalope would like to believe that’s what’s happening but, at this point, it’s just speculation.

While the Macalope rejects those who have gone from zero to 60 speeding down Outrage Boulevard in their histrionics mobiles over this, it is not wrong at all to be concerned about these new technologies. All this is definitely something we need to keep our eyes on. The bottom line for companies in a capitalist system is the bottom line, and given other decisions Apple has made to comply with anti-privacy laws in foreign states, it’s not unreasonable to be concerned about how this could be perverted. Apple executives surely care about both protecting children and privacy, but despite the fact that we treat companies in this country as if they were individuals, companies do not have souls.

With the possible exception of Dolly Parton Enterprises Inc.
https://www.macworld.com/article/352936/macalope-facts-fiction-apple-child-abuse-scanning-tools-imes...