
When Apple introduced its slate of initiatives to prevent the spread of child sexual abuse material, or CSAM, last year, they were controversial, to say the least. While some praised the company for taking action, there was also no shortage of detractors, some of whom said that Apple’s plans to do on-device scanning for illegal content would require an unacceptably large hit to user privacy.

The backlash caused Apple to delay some of the features in September 2021, and earlier this week, the company confirmed it has abandoned its efforts to create the hashing system that would’ve searched people’s iCloud photo libraries for illegal materials. We contacted some of the organizations that had spoken out either in support of or against Apple’s initiative to see what they had to say now that it’s gone.

The National Center for Missing & Exploited Children

The National Center for Missing & Exploited Children, or NCMEC, was going to be one of Apple’s partners for its image scanning system, with the center providing both the hashes of known CSAM images and assistance with reviewing anything the system found before contacting the authorities.

As you might imagine, NCMEC isn’t particularly pleased with Apple’s decision to drop the feature, and the company’s simultaneous announcement of even stronger iCloud privacy measures that will end-to-end encrypt backups doesn’t seem to be helping matters. “The National Center for Missing & Exploited Children opposes privacy measures that ignore the undisputed realities of child sexual exploitation online,” said Michelle DeLaune, the organization’s president and CEO, in a statement to The Verge. The rest of the statement reads:

We support privacy measures to keep personal data secure – yet privacy must be balanced with the reality that countless children are being sexually victimized online every day. End-to-end encryption without a solution in place to detect child sexual exploitation will allow lawless environments to flourish, embolden predators, and leave child victims unprotected.

Proven technology tools exist and have been used successfully for over a decade that allow the detection of child sexual exploitation with surgical precision. In the name of privacy, companies are enabling child sexual exploitation to occur unchecked on their platforms.

NCMEC remains steadfast in calling upon the technology industry, political leaders, and academic and policy experts to come together to agree upon solutions that will achieve consumer privacy while prioritizing child safety.

The Center for Democracy and Technology, the Electronic Frontier Foundation, and Fight for the Future

In August 2021, the Center for Democracy and Technology (CDT) posted an open letter to Apple expressing concern over the company’s plans and calling on it to abandon them. The letter was signed by around 90 organizations, including the CDT. “We’re very excited, and we’re counting this as a huge victory for our advocacy on behalf of user security, privacy, and human rights,” said Mallory Knodel, chief technology officer for the organization, speaking about Apple’s cancellation announcement.

Knodel thinks that Apple’s change of heart may have been partly a response to the urging of CDT and others and partly because the company saw the winds shifting on the topic of client-side scanning. “Earlier this year, Meta had a similar conclusion when they asked for a human rights impact assessment of their possible decision to move towards end-to-end encryption of their messaging platforms, both on Instagram messenger kids and Facebook Messenger,” she said. When the organization conducting the assessment suggested a similar type of scanning, though, Knodel says Meta was “very, very strong in saying ‘under no circumstances are we going to pursue client-side scanning as an option.’ And that, I think, has helped.”

Other organizations that signed the original letter echoed some of Knodel’s sentiments.

“Encryption is one of the most important tools we have for maintaining privacy and security online,” said Andrew Crocker, senior staff attorney for the Electronic Frontier Foundation. “We applaud Apple for listening to experts, child advocates, and users who want to protect their most sensitive data.”

Meanwhile, Fight for the Future’s Caitlin Seeley George called Apple’s announcement on Wednesday “a huge victory,” adding that “on-device scanning of messages and photos would have been incredibly dangerous – Apple would essentially have forced malware on its users, which would go completely against the company’s ‘pro-privacy’ marketing, would have broken end-to-end encryption, and would not have made anyone safer.”

Knodel hinted, however, that the fight isn’t necessarily over. “As people who should be claiming part of this victory, we need to be really loud and excited about it, because you have, both in the EU and in the UK, two really prominent policy proposals to break encryption,” she said, referencing the Chat Control child safety directive and the Online Safety Bill. “With Apple making these strong pro-encryption moves, they might be tipping that debate or they might be provoking it. So I’m sort of on the edge of my seat waiting.”

Not all of Apple’s child protection plans were scrapped. Parents or guardians can enable a communication safety system for iMessage that can scan photos sent to minors for nudity. However, contrary to Apple’s initial announcement, parents aren’t automatically alerted if the minor chooses to look at the image. Instead, it’s left up to the child whether to alert their parents, though the system makes it very easy to do so.
