Meta scrambles to fix Instagram algorithm connecting ‘vast paedophile network’

‘Suggested for you’ feature reportedly linked users to child sexual abuse material

Anthony Cuthbertson
Thursday 08 June 2023 06:13 EDT
Instagram’s logo displayed on a tablet screen on 18 October, 2021 (Getty Images)


Meta has launched an investigation into reports that Instagram is promoting child sexual abuse material through its algorithm.

Facebook’s parent company set up a task force to investigate the claims after the Stanford Internet Observatory (SIO) said it found “large-scale communities” sharing paedophilic content on the platform.

The SIO said it discovered the child sexual abuse material (CSAM) following a tip from the Wall Street Journal, whose report on Wednesday detailed how Instagram’s recommendation algorithm helped connect a “vast pedophile network” of sellers and buyers of illegal material.

Instagram’s ‘suggested for you’ feature also linked users to off-platform content sites, according to the report, with the SIO describing Instagram as “currently the most important platform” for these networks.

“Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers,” Stanford’s Cyber Policy Center wrote in a blog post.

“Instagram’s popularity and user-friendly interface make it a preferred option for these activities.”

Instagram users were able to find child abuse content through explicit hashtags like #pedowhore, which have since been blocked by Instagram.

“Child exploitation is a horrific crime,” a Meta spokesperson said.

“We’re continuously investigating ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them.”

Meta said it had taken down 27 paedophile networks on Instagram over the past two years, and in January alone removed 490,000 accounts that violated its child safety policies.

The SIO also identified other social media platforms hosting this type of content, though to a much lesser extent.

The SIO called for an industry-wide initiative to limit production, discovery and distribution of CSAM, while also urging companies to devote more resources to proactively identifying and stopping abuse.

“Given the multi-platform nature of the problem, addressing it will require better information sharing about production networks, countermeasures, and methods for identifying buyers,” the organisation said.

“SIO hopes that this research aids industry and non-profits in their efforts to remove child sexual abuse material from the internet.”
