More than 90 policy groups call for Apple to abandon plan to scan images on iPhones

Andrew Griffin
Thursday 19 August 2021 06:14 EDT

Apple has been urged to drop its plans to scan iPhones for child sexual abuse material, in a new open letter signed by more than 90 policy and rights groups.

Earlier this month, the company announced a feature for the iPhone that would scan images for known child sexual abuse material, or CSAM, as they were uploaded to its servers, and alert Apple if any was found. It also said that phones would use artificial intelligence to spot when children were exchanging pictures that appeared to feature nudity, and alert their parents.

Apple has said that the feature is required in the face of the vast amount of abuse imagery that circulates online. It has said it is built with privacy in mind, including tools that ensure the analysis happens on users’ phones rather than in the cloud.
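Apple has published only a high-level description of that on-device matching, but the basic idea, comparing a fingerprint of each photo against a fixed list of known-image fingerprints before upload, can be illustrated loosely. The sketch below is an assumption-laden illustration in Python: the hash list is invented, and an ordinary SHA-256 digest stands in for Apple's perceptual "NeuralHash" and the private set intersection protocol it sits inside, neither of which is reproduced here.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in database of known-image digests. In Apple's described
# system the fingerprints come from child-safety clearinghouses and are matched
# blindly via a cryptographic protocol; a plain set of (fabricated) SHA-256
# values is used here only to illustrate the matching step.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_digest(path: Path) -> str:
    """Return a hex digest of the raw file bytes. Illustrative only: a real
    perceptual hash is designed to survive resizing and re-encoding, which
    SHA-256 does not."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_before_upload(paths: list[Path]) -> list[Path]:
    """Check each image on-device and return only those matching the
    known-hash set; non-matching photos would never be reported."""
    return [p for p in paths if image_digest(p) in KNOWN_IMAGE_HASHES]
```

According to Apple's public descriptions, the real design adds safeguards this sketch leaves out entirely, notably that matches are cryptographically blinded and a threshold number of them must accumulate before the company can decrypt anything for human review.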

But privacy activists and security experts have raised concerns that the feature could be misused: the same mechanism could be repurposed to scan for other kinds of images, experts have suggested, among other potential problems.

Those issues were highlighted in the open letter, which calls on Apple to abandon its plans to roll the feature out in an upcoming update to the operating system that powers the iPhone and iPad.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter, which was first reported by Reuters.

The campaign, by far the largest to date over an encryption issue at a single company, was organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT).

Some overseas signatories in particular are worried about the impact of the changes in nations with different legal systems, including some already hosting heated fights over encryption and privacy.

“It’s so disappointing and upsetting that Apple is doing this, because they have been a staunch ally in defending encryption in the past,” said Sharon Bradford Franklin, co-director of CDT’s Security & Surveillance Project.

An Apple spokesman said the company had addressed privacy and security concerns in a document Friday outlining why the complex architecture of the scanning software should resist attempts to subvert it.

Those signing included multiple groups in Brazil, where courts have repeatedly blocked Facebook’s WhatsApp for failing to decrypt messages in criminal probes, and where the Senate has passed a bill that would require messages to be traceable, which would mean somehow marking their content. A similar law was passed in India this year.

“Our main concern is the consequence of this mechanism, how this could be extended to other situations and other companies,” said Flavio Wagner, president of the independent Brazil chapter of the Internet Society, which signed. “This represents a serious weakening of encryption.”

Other signers were in India, Mexico, Germany, Argentina, Ghana and Tanzania.

Surprised by the outcry that followed its announcement two weeks ago, Apple has offered a series of explanations and documents to argue that the risks of false detections are low.

Apple said it would refuse demands to expand the image-detection system beyond pictures of children flagged by clearinghouses in multiple jurisdictions, though it has not said it would pull out of a market rather than obey a court order.

Though most of the objections so far have been over on-device scanning, the coalition’s letter also faults a change to iMessage in family accounts, which would try to identify and blur nudity in children’s messages, letting them view it only if parents are notified.

The signers said the step could endanger children in intolerant homes or those seeking educational material. More broadly, they said the change will break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.

“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter says.

Other groups that signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Additional reporting by Reuters
