Apple’s Controversial CSAM Photo Detection Feature May Be Toast

Months after a bungled announcement of a controversial new feature designed to scan iPhones for potential child sexual abuse material (CSAM), Apple has covertly wiped any mention of the plan from the Child Safety page on its website.

The change, first spotted by MacRumors, comes after Apple’s August announcement of a planned suite of features designed to combat the spread of CSAM. But the on-device CSAM detection feature stood out among the other planned additions as a particular concern, with security researchers, policy groups, and regular-old Apple customers alike balking at the plan’s potential to erode privacy.

The CSAM detection feature was designed to utilise a neural matching function called NeuralHash, which would ostensibly compute unique hashes of users’ photos (sort of like digital fingerprints) and compare them against a large database of known CSAM hashes compiled by the National Center for Missing & Exploited Children (NCMEC). If a user’s iPhone was flagged for containing such images, the case would be kicked over to human reviewers, who would presumably get law enforcement involved.
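To make the general shape of that matching pipeline concrete, here is a minimal, purely illustrative sketch in Python. It is not Apple’s NeuralHash or its on-device code: the toy_hash function is only a stand-in for a perceptual hash, the KNOWN_HASHES database and the helper names are hypothetical, and the 30-match threshold reflects the figure Apple cited in its technical summary.

```python
# Illustrative sketch only. This is NOT Apple's NeuralHash or its on-device
# matching code; it just demonstrates the described design at a high level:
# hash each photo, compare against a database of known hashes, and escalate
# an account only after a match threshold is crossed.
from hashlib import sha256
from pathlib import Path

# Hypothetical database of known CSAM hashes. In the described system these
# would be perceptual hashes supplied by NCMEC, not cryptographic digests.
KNOWN_HASHES: set[str] = set()

# Apple's technical summary described a threshold of roughly 30 matches
# before any account was surfaced for human review.
MATCH_THRESHOLD = 30


def toy_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash.

    A real perceptual hash maps visually similar images to similar values;
    SHA-256 is used here only to keep the sketch self-contained.
    """
    return sha256(image_bytes).hexdigest()


def count_matches(photo_paths: list[Path]) -> int:
    """Count how many photos hash to an entry in the known-hash database."""
    return sum(
        1 for path in photo_paths if toy_hash(path.read_bytes()) in KNOWN_HASHES
    )


def should_escalate(photo_paths: list[Path]) -> bool:
    """Flag an account for human review only once the threshold is crossed."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

Apple’s published design also layered cryptographic protections, such as threshold secret sharing, on top of this basic idea, so that matches below the threshold could not be read by anyone, including Apple.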

But critics argued that giving Apple the ability to trawl users’ private data was problematic for a number of reasons, both because of the system’s potential to misidentify CSAM (would a photo of your child in the bathtub land you on an FBI watchlist?) and because of its potential to open the door to a dangerous surveillance precedent.

Apple, for its part, was dogged in its early attempts to allay fears about the planned feature, trotting out senior executives to do interviews with the Wall Street Journal on how the plan was actually “an advancement of the state of the art in privacy” and releasing a slew of press materials meant to explain away any concerns. But when those efforts did nothing to quell the public outcry over the feature, Apple announced in September that it was making the rare decision to walk back the plans in order to fine-tune them before public release.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple told Gizmodo at the time.

Indeed, although the newly launched iOS 15.2 does contain some of the original features of the Child Safety initiative — including updates to Siri, Spotlight, and Safari that add new safety warnings to help children stay out of danger while surfing the web — the CSAM photo detection feature is nowhere to be found. And if Apple’s quiet retreat from any mention of the feature on its website is any indication, it might be safe to assume that it’ll be a while — if ever — before we see it deployed on our devices.

