Illustration by Alex Castro / The Verge
When Apple introduced its slate of initiatives to prevent the spread of child sexual abuse material, or CSAM, last year, the plans were controversial, to say the least. While some praised the company for taking action, there was no shortage of detractors, some of whom said that Apple's plan to do on-device scanning for illegal content would exact an unacceptably large toll on user privacy.
The backlash caused Apple to delay some of the features in September 2021, and earlier this week, the company confirmed it has abandoned its efforts to create the hashing system that would've searched people's iCloud photo libraries for illegal materials. We contacted some of the organizations that had spoken out either in support of or against Apple's initiative to see what they had to say now that it's gone.
The National Center for Missing & Exploited Children
The National Center for Missing & Exploited Children, or NCMEC, was going to be one of Apple's partners for its image scanning system, with the center providing both the hashes of known CSAM images and assistance with reviewing anything the system found before contacting the authorities.
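At its core, this kind of system checks whether a photo's fingerprint appears in a database of fingerprints of already-identified images. The sketch below is a deliberately simplified illustration of that hash-set matching idea only: Apple's actual design used a perceptual "NeuralHash" (which tolerates small image edits) plus cryptographic threshold protocols, whereas this toy example uses a plain SHA-256 digest, and the image bytes shown are made up.

```python
import hashlib

# Hypothetical database of hashes of known images. In Apple's real design
# this would have been NCMEC-supplied perceptual hashes, not SHA-256 digests.
known_hashes = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

print(matches_known_hash(b"example-flagged-image-bytes"))  # True
print(matches_known_hash(b"some-other-photo-bytes"))       # False
```

Note that a cryptographic hash like SHA-256 flags only byte-identical files; a single changed pixel produces a completely different digest, which is why real systems rely on perceptual hashing instead.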
As you might imagine, NCMEC isn't particularly pleased with Apple's decision to drop the feature, and the company's simultaneous announcement of even stronger iCloud privacy measures that will end-to-end encrypt backups doesn't seem to be helping matters. "The National Center for Missing & Exploited Children opposes privacy measures that ignore the undisputed realities of child sexual exploitation online," said Michelle DeLaune, the organization's president and CEO, in a statement to The Verge. The rest of the statement reads:
We support privacy measures to keep personal data secure – yet privacy must be balanced with the reality that countless children are being sexually victimized online every day. End-to-end encryption without a solution in place to detect child sexual exploitation will allow lawless environments to flourish, embolden predators, and leave child victims unprotected.
Proven technology tools exist and have been used successfully for over a decade that allow the detection of child sexual exploitation with surgical precision. In the name of privacy, companies are enabling child sexual exploitation to occur unchecked on their platforms.
NCMEC remains steadfast in calling upon the technology industry, political leaders, and academic and policy experts to come together to agree upon solutions that will achieve consumer privacy while prioritizing child safety.
The Center for Democracy and Technology, the Electronic Frontier Foundation, and Fight for the Future
In August 2021, the Center for Democracy and Technology (CDT) posted an open letter to Apple expressing concern over the company's plans and calling on it to abandon them. The letter was signed by around 90 organizations, including the CDT. "We're very excited, and we're counting this as a huge victory for our advocacy on behalf of user security, privacy, and human rights," said Mallory Knodel, chief technology officer for the organization, talking about Apple's cancellation announcement.
Knodel thinks that Apple's change of heart may have been in part a response to the urging of CDT and others but also because it saw the winds shifting on the topic of client-side scanning. "Earlier this year, Meta reached a similar conclusion when they asked for a human rights impact assessment of their possible decision to move towards end-to-end encryption of their messaging platforms, both on Instagram messenger kids and Facebook Messenger," she said. When the organization conducting the assessment suggested a similar type of scanning, though, Knodel says Meta was "very, very strong in saying 'under no circumstances are we going to pursue client-side scanning as an option.' And that, I think, has helped."
Other organizations that signed the original letter echoed some of Knodel's sentiments.
"Encryption is one of the most important tools we have for maintaining privacy and security online," said Andrew Crocker, senior staff attorney for the Electronic Frontier Foundation. "We applaud Apple for listening to experts, child advocates, and users who want to protect their most sensitive data."
Meanwhile, Fight for the Future's Caitlin Seeley George called Apple's announcement on Wednesday "a huge victory," adding that "on-device scanning of messages and photos would have been incredibly dangerous – Apple would essentially have forced malware on its users, which would go completely against the company's 'pro-privacy' marketing, would have broken end-to-end encryption, and would not have made anyone safer."
Knodel hinted, however, that the fight isn't necessarily over. "As people who should be claiming part of this victory, we need to be really loud and excited about it, because you have, both in the EU and in the UK, two really prominent policy proposals to break encryption," she said, referencing the Chat Control child safety directive and Online Safety Bill. "With Apple making these strong pro-encryption moves, they might be tipping that debate or they might be provoking it. So I'm sort of on the edge of my seat waiting."
Not all of Apple's child protection plans were scrapped. Parents or guardians can enable a communication safety system for iMessage that can scan photos sent to minors for nudity. However, contrary to Apple's initial announcement, parents aren't automatically alerted if the minor chooses to look at the image. Instead, it's left up to the child whether they want to alert their parents, though the system makes it very easy to do so.