Apple Reportedly Working on Problematic iOS Tool to Scan for Child Abuse Photos on iPhones

Apple is purportedly poised to announce a new tool that will help identify child abuse in photos on a user’s iPhone. The tool would supposedly use a “neural matching function” to detect if images on a user’s device match known child sexual abuse material (CSAM) fingerprints. While it appears that Apple has taken user privacy into consideration, there are also concerns that the tech may open the door to unintended misuse — particularly when it comes to surveillance.

The news comes via well-known security expert Matthew Green, an associate professor at the Johns Hopkins Information Security Institute. Apple has yet to confirm the report, however. That said, Green is a credible source who’s written extensively about Apple’s privacy methods over the years. Notably, he’s worked with Apple in the past to patch a security flaw in iMessage.

“I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea,” Green tweeted in a thread late last night. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.”

The crux of the issue is that while various tech companies, including Apple, have added end-to-end encryption to their services and products, the move has been opposed by various governments. While end-to-end encryption is a win for consumer privacy, the argument is that it also makes it harder for law enforcement to crack down on illegal content like child pornography. According to Green, a “compromise” is to run these scanning technologies on the “client side,” or on your phone, before photos are sent and encrypted in the cloud. Green also claims that Apple’s version wouldn’t initially be used on encrypted images, only your iPhone’s photo library, and only if you have iCloud Backup enabled. In other words, it would only scan photos that are already on Apple’s servers. However, Green also questions why Apple would go through the effort of designing this type of system if it didn’t have eventual plans to use it for end-to-end encrypted content.
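To make the mechanism Green describes a little more concrete, here is a minimal, purely illustrative sketch of client-side perceptual-hash matching. Apple has not published any implementation details, so every name, hash value, and threshold below is an assumption invented for illustration; the only ideas borrowed from Green’s description are that photos are compared on-device against known fingerprints and reported only if enough matches accumulate.

```python
# Toy sketch only -- not Apple's actual system, whose details are unknown.
# All hashes, tolerances, and thresholds here are made up for illustration.

KNOWN_CSAM_HASHES = {0b1011_0110_0011_1101, 0b0110_1001_1100_0010}  # stand-in fingerprint database
MATCH_THRESHOLD = 3  # hypothetical: only report once several matches accumulate

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two perceptual hashes."""
    return bin(a ^ b).count("1")

def is_match(photo_hash: int, max_distance: int = 4) -> bool:
    """A photo 'matches' if its hash is close enough to any known fingerprint."""
    return any(hamming_distance(photo_hash, known) <= max_distance
               for known in KNOWN_CSAM_HASHES)

def scan_library(photo_hashes: list[int]) -> bool:
    """Client-side pass over a photo library; True means 'report to the server'."""
    matches = sum(1 for h in photo_hashes if is_match(h))
    return matches >= MATCH_THRESHOLD

if __name__ == "__main__":
    # A library containing only unrelated photos stays below the threshold.
    print(scan_library([0b1111_0000_1111_0000, 0b0000_1111_0000_1111]))  # False
```

The key point of this toy version is that nothing is reported unless the match count crosses the threshold, which mirrors the “report them to Apple servers if too many appear” behavior Green describes.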

No one wants to go to bat for child pornography, but Green points out that this tech, while nobly intended, has far-reaching consequences and can potentially be misused. For instance, CSAM fingerprints are deliberately fuzzy: if they were too exact, you could simply crop, resize or otherwise edit an image to evade detection. However, that same fuzziness means bad actors could craft harmless images that “match” problematic ones. An authoritarian government, for example, could tag political campaign posters as matches in order to suppress activists.
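A small sketch of that trade-off, again with invented hash values rather than anything from a real system: the same distance tolerance that lets a slightly edited copy of a known image still match also leaves room for an unrelated image to be deliberately nudged into matching range.

```python
# Illustrative sketch of the fuzziness trade-off; all values are invented.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two perceptual hashes."""
    return bin(a ^ b).count("1")

FINGERPRINT    = 0b1011_0110_0011_1101   # hash of a known image (made up)
EDITED_COPY    = 0b1011_0110_0011_1001   # same image, slightly cropped/recompressed
CRAFTED_BENIGN = 0b1011_0111_0011_1100   # unrelated image nudged to land nearby
TOLERANCE = 4                            # hypothetical match distance

for name, h in [("edited copy", EDITED_COPY), ("crafted benign image", CRAFTED_BENIGN)]:
    print(f"{name} matches: {hamming_distance(FINGERPRINT, h) <= TOLERANCE}")
# Both print True: the looseness that defeats cropping also enables collisions.
```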

The other concern is that Apple is setting a precedent, and once that door is open, it’s that much harder to close.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Green writes. “That’s the message they’re sending to governments, competing services, China, you.”