Facebook delays rollout of encrypted messaging amid child safety fears

Meta said it is taking the time to ‘get this right’

Katy Clifton
Saturday 20 November 2021 18:57 EST
Facebook’s encryption plan has previously sparked warnings that it threatens children’s safety online (REUTERS)


Facebook has delayed a rollout of encrypted messaging amid fears such a move could put children at greater risk of exploitation and abuse.

Meta, which owns the social networking giant as well as the messaging service WhatsApp, said it is taking time to “get this right” and pledged to strike a balance between privacy and safety online.

The company had previously said it was aiming to roll out encrypted messaging in its Messenger and Instagram apps at some point in 2022, but has now said the rollout will not be complete until 2023.

End-to-end encryption hides messages from everyone except those in a conversation, and has previously sparked warnings that it threatens children’s safety online.

Antigone Davis, Meta’s head of safety, said the company will keep working with experts to tackle abuse, and insisted that in previous cases the firm would still have been able to help the authorities even if its services had been encrypted.

Writing in the Sunday Telegraph, she said: “Our recent review of some historic cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted.

“While no systems are perfect, this shows that we can continue to stop criminals and support law enforcement.

“We’ll continue engaging with outside experts and developing effective solutions to combat abuse because our work in this area is never done. We’re taking our time to get this right and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023.”

Meta’s other messaging platform, WhatsApp, is already fully encrypted.

Last year, Anne Longfield, then the children’s commissioner for England, said plans by social media firms for more encryption in messaging services would place children at greater risk by making it impossible for platforms to monitor content and by preventing police from gathering potentially vital evidence of child sexual exploitation.

Facebook’s plans for encryption have also previously been criticised by the Government, with Home Secretary Priti Patel warning that the move puts children at risk and offers a hiding place for abusers and other criminals.

Ms Davis said the firm is “determined to protect people’s private communications and keep people safe online”, but added that “people shouldn’t have to choose between privacy and safety”.

She added that the firm is “building strong safety measures into our plans and engaging with privacy and safety experts, civil society and governments to make sure we get this right”.

The company set out its “three-pronged approach” which it said involves preventing harm, giving people more control, and quickly responding if something happens.

It comes a day after the head of Ofcom reportedly called for social media companies to face sanctions if they do not prevent adults from directly messaging children.

The communications watchdog will regulate the sector under the Online Harms Bill and has the power to fine companies and block access to sites.

The Times reported that Dame Melanie Dawes will encourage the regulator to closely examine direct messaging when the new regulations are introduced in 2023.

Speaking about the industry and the bill, Dame Melanie said: “I don’t think it’s sustainable for them to carry on as we are. Something’s got to change.

“What regulation offers them is a way to have consistency across the industry, to persuade users that they’re putting things right, and to prevent what could be a real erosion of public trust.

“They really need to persuade us that they understand who’s actually using their platforms, and that they are designing for the reality of their users and not just the older age group that they all say they have in their terms and conditions.”

Ms Davis said the firm is “determined to protect people’s private communications” (Getty)

In her piece, which was written before Dame Melanie’s comments, Ms Davis said Facebook was protecting under-18s on its site with measures including defaulting them into private or “friends only” accounts and restricting adults from messaging them if they are not already connected.

The proposals in the Online Harms Bill include punishments for non-compliant firms such as large fines of up to £18 million or 10% of their global turnover – whichever is higher.

In August, Instagram announced it would require all users to provide their date of birth, while Google has introduced a raft of privacy changes for children who use its search engine and YouTube platform.

TikTok also began limiting the direct messaging abilities of accounts belonging to 16 and 17-year-olds, as well as offering advice to parents and caregivers on how to support teenagers when they sign up.

Andy Burrows, head of child safety online policy at the NSPCC, said: “Facebook is right not to proceed with end-to-end encryption until it has a proper plan to prevent child abuse going undetected on its platforms. But they should only go ahead with these measures when they can demonstrate they have the technology in place that will ensure children will be at no greater risk of abuse.

“More than 18 months after an NSPCC-led global coalition of 130 child protection organisations raised the alarm over the dangers of end-to-end encryption, Facebook must now show they are serious about the child safety risks and not just playing for time while they weather difficult headlines.”
