Facebook could be letting inappropriate pictures of children through moderation, report alleges
Meta, the parent company of Facebook and Instagram, has reportedly told moderators to “err on the side of an adult” when moderating pictures or videos of young people.
Antigone Davis, head of safety for Meta, told the New York Times that the policy stems from privacy concerns for people who post sexual images of adults. “The sexual abuse of children online is abhorrent,” Ms Davis said.
However, millions of photos and videos are moderated as they are uploaded to Facebook, and the company made 27 million reports of suspected child abuse in 2021. Yet experts believe that moderators are still likely missing some minors.
A training document created for moderators at Accenture, a consulting firm used by Facebook, reportedly tells moderators to “bump up” adolescents to young adults. The Independent has reached out to Accenture for comment.
Content moderators who have worked for Meta reportedly said that they encountered sexual images affected by this policy every day, and that they would face negative performance reviews if they made too many erroneous reports.
“They were letting so many things slide that we eventually just didn’t bring things up anymore,” said one former moderator. “They would have some crazy, extravagant excuse like, ‘That blurry portion could be pubic hairs, so we have to err on the side of it being a young adult.’”
Facebook and other technology companies use the ‘Tanner stages’ to determine the stages of puberty; the scale was developed by paediatrician Dr James Tanner in the 1960s, but it is not designed to determine age. Ms Davis said that this was “just one factor in estimating age”, however; other factors could include muscle development or the child’s face.
Companies like Meta must report “apparent” child sexual abuse material, but the law does not define the word “apparent”. Ms Davis said it was unclear whether the law would protect Meta if it erroneously reported an image.
Apple, Snap, which owns Snapchat, and TikTok told the Times that they take the opposite approach to Meta – reporting any sexual image whose subject’s age is in doubt.
“We report more material to the National Center for Missing and Exploited Children than literally all other tech platforms combined, because we are the most aggressive. What’s needed is for lawmakers to establish a clear and consistent standard for all platforms to follow”, a Meta spokesperson said.