Apple's pause on child porn scanning highlights limits of trust in Silicon Valley

03 September 2021 21:44


Apple’s attempt to thread the needle between privacy and combating child exploitation unraveled today with a terse reversal from the smartphone behemoth.

This isn't the first time, nor will it likely be the last, that Apple has run headlong into the problem of how its privacy protections are used by the innocent and criminal alike. But for now, the company said it’s pausing a plan to scan photos kept on mobile devices for matches with sexualized images of minors known to national clearinghouses such as the US’s National Center for Missing and Exploited Children.

Today’s reversal is an unusual about-face for a headstrong company, one that shows how even the most privacy-minded of the Silicon Valley giants walks a thin line on trust.

Only weeks earlier, Apple said the next version of its iOS mobile operating system would check photos synced with its cloud storage service for child exploitation material, triggering a backlash among consumers and privacy advocates. The announcement provoked concerns that the system could be misused to flag material beyond child-abuse images, and that it would invade the privacy of users’ photo libraries.

The company’s attempt to balance the competing equities of criminal surveillance and everyday privacy imploded, likely with little more to show than a damaged reputation.

Reaction to Apple’s decision to enable device-side scanning may have been particularly harsh given the company’s reputation for resisting pressure to undercut privacy. Apple famously fought the FBI’s 2016 attempt to force it to create a specialized operating system to bypass its own security mechanisms after the law-enforcement agency sought to unlock the iPhone of a California mass shooter and self-proclaimed ISIS terrorist.

Privacy, Apple Chief Executive Tim Cook has said, can be a matter of “life and death.”

Still, it’s one thing to say no to a transparent bid by the FBI to leverage a terrorist incident into a precedent-setting benchmark for police access to encrypted devices. Saying no to fighting child sexual abuse material is another. Those who have been exposed to such material warn that it’s life-altering. The images can be vile, triggering a strong desire to remove them from the Internet at all costs.

Apple’s emphasis on privacy has led it to become “the greatest platform for distributing child porn,” lamented Apple anti-fraud chief Eric Friedman in a February 2020 text message chat made public as part of an antitrust lawsuit.

Apple’s technical solution could have been worse. The system would have matched hashes — mathematically derived unique fingerprints — of known child sexual abuse images with hashes of users' images synced with iCloud. Apple says it would encrypt, with two separate cryptographic keys, the hashes of iCloud-synced images. Only when it detected 30 known child sexual abuse images in a user’s account would the company be able to unlock the image information. Notably, Apple didn’t promise to undermine encryption, and the detection system may even have been a prelude to Apple announcing iCloud end-to-end encryption.
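Stripped of the cryptography, the matching logic amounts to fingerprinting a user's synced photos, comparing those fingerprints against a fixed set of known hashes, and counting hits against a threshold. The Python sketch below is only a loose illustration of that counting logic: it substitutes an ordinary SHA-256 digest for Apple's proprietary perceptual hash (NeuralHash) and ignores the cryptographic blinding entirely, and the function names are hypothetical, not Apple's.

```python
import hashlib
from typing import Iterable, Set

# Per the article, Apple would only have been able to unlock the matching
# images' information after 30 hits in a single account.
MATCH_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. Apple's real system used a perceptual hash so
    that visually identical images match even after re-encoding; a plain
    SHA-256 digest is used here purely to keep the sketch runnable."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(user_images: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many of a user's synced images match the clearinghouse set."""
    return sum(1 for img in user_images if fingerprint(img) in known_hashes)


def account_flaggable(user_images: Iterable[bytes], known_hashes: Set[str]) -> bool:
    """Only past the threshold would decryption of the match information,
    and human review, have become possible."""
    return count_matches(list(user_images), known_hashes) >= MATCH_THRESHOLD
```

In the actual design, the running count itself was hidden from Apple below the threshold through a cryptographic threshold scheme; the sketch simply counts in the clear to show the shape of the decision.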

The operating system update was good enough for the National Center for Missing and Exploited Children, which called critics of the system the “screeching voices of the minority” in a leaked memo distributed to Apple staff.

A firestorm still resulted. Among other things, scanning practically every image kept on a mobile device for a connection to child pornography feels invasive. And there’s reason to doubt Friedman’s assertion about Apple’s outsized role in distributing child sexual abuse material. Smartphones tend to be closely tied to real identities, while consumers of child pornography often gravitate to free online storage accounts that barely require identity credentials, Stanford University Privacy and Data Fellow Jen King told MLex.

Friedman himself, in his February 2020 text exchange, replied “yes” to a colleague’s assertion that file-sharing systems present more opportunities for bad actors.

Online storage providers also scan for prohibited material, but they don’t undertake generalized, mass scanning. “The user has to do something dramatically visible” for their account to be scanned, a former cloud computing executive told MLex, speaking on condition of anonymity in order to talk frankly. Anomalous behavior could include a newly created shared folder that suddenly picks up downloads from multiple individuals — a pattern radically different from how most consumers use and share online storage accounts.
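The pattern the executive describes can be reduced to a simple heuristic: a freshly created shared folder that quickly attracts downloads from many distinct accounts. The Python sketch below is a hypothetical illustration of such a rule; the one-week window and downloader count are invented placeholders, not figures any provider supplied.

```python
from datetime import datetime, timedelta

# Illustrative placeholders only; the executive described the pattern in
# general terms, not with specific numbers.
NEW_FOLDER_WINDOW = timedelta(days=7)
DISTINCT_DOWNLOADER_LIMIT = 20


def folder_looks_anomalous(created_at: datetime,
                           download_events: list[tuple[datetime, str]]) -> bool:
    """Flag a shared folder that draws downloads from many distinct users
    shortly after creation, the kind of 'dramatically visible' behavior
    said to trigger a closer look (events are (timestamp, user_id) pairs)."""
    recent_downloaders = {user for ts, user in download_events
                          if ts - created_at <= NEW_FOLDER_WINDOW}
    return len(recent_downloaders) >= DISTINCT_DOWNLOADER_LIMIT
```

The point of such a rule is proportionality: scanning is triggered by conspicuous sharing behavior rather than applied to every file every user stores.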

Apple’s decision to scan everything for everybody, in short, seemed disproportionate.

Of course, from the perspective of victims, a company with even a relatively small child sexual abuse material problem has too big a problem. It’s here where the problem of competing equities becomes difficult, and questions of trust become salient.

Apple maintains it would never allow its client-side monitoring to be corrupted by, say, an authoritarian country looking to root out images of dissent. Should consumers trust it? Silicon Valley often strikes uncomfortable deals with governments to maintain market access. Even the US government might not be forever immune from the temptations of such monitoring — and once such a system is turned on, it’s nearly impossible to turn off.

The Apple system doesn’t require trust in Apple alone, either. Users also have to be confident that the hashes established by child sexual abuse material clearinghouses, including the National Center for Missing and Exploited Children, are accurate. Nothing suggests that the center’s library is intentionally or unintentionally misleading, but the organization tends to be secretive about its operations while vocally inserting itself into policy debates about matters such as encryption or website intermediary liability. It’s a combination that doesn’t always inspire confidence from privacy advocates, who, among other things, don’t appreciate being dismissed as a “screeching voice.”

Apple tried to roll out a technical solution whose privacy-preserving features were designed to break only in the case of a terrible crime. It thought it could placate both the privacy and the law-enforcement communities.

Today’s retreat, even if just temporary, shows how unlikely it is that any technical solution could bridge those polarized opposites, long accustomed to fighting each other in a zero-sum war of attrition. And it shows how precarious Apple’s position as the resident privacy-friendly company is. Apple consumers want privacy. They don’t want to have to trust Apple when it appears to undermine it.
