TikTok moderators see ‘death, graphic pornography, and nude children every day’, lawsuit alleges

‘People like us have to filter out the unsavory content. Somebody has to suffer and see this stuff so nobody else has to’, one TikTok moderator alleged

Adam Smith
Friday 25 March 2022 13:02 EDT


A pair of former TikTok moderators are suing the viral video company for failing to support them as they removed banned content from the app.

Ashley Velez and Reece Young allege that TikTok did not provide them with adequate mental health support despite the content they had to watch.

"We would see death and graphic, graphic pornography. I would see nude underage children every day," Ms Velez told NPR.

"I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight."

Ms Velez worked for TikTok between May and November 2021, part of the 10,000-strong content moderation team that manages the app worldwide.

"Underage nude children was the plethora of what I saw," said Ms Velez. "People like us have to filter out the unsavory content. Somebody has to suffer and see this stuff so nobody else has to."

Moderators at TikTok must reportedly review videos "for no longer than 25 seconds" before deciding whether they are acceptable. They have to maintain an 80 per cent accuracy rate but, to meet quotas, moderators watch multiple videos simultaneously, it is alleged.

The pair were allowed two 15-minute breaks and one hour for lunch during a 12-hour shift. Taking longer breaks, the lawsuit alleges, would result in lost pay.

The lawsuit also attributes the moderators' trauma to the volume of conspiracy theories they had to watch – on topics ranging from the COVID-19 pandemic to Holocaust denial.

TikTok declined to comment to The Independent but told NPR, which first reported on the lawsuit, that it "strives to promote a caring working environment for our employees and contractors" and that moderators are offered "a range of wellness services so that moderators feel supported mentally and emotionally."

TikTok is not the only company that has been criticised for not supporting its moderators adequately.

Facebook content moderators, under the stress of constantly reviewing graphic and offensive material, reportedly cope by using drugs and having sex while on the job.

Much like at TikTok, these moderators are allotted two 15-minute breaks, one 30-minute lunch break, and nine minutes of “wellness time” during one work day.

Rather than being employed by Facebook or its parent company Meta, these moderators are hired through outside contractors and do not receive the same support that direct employees do.

A Facebook spokesperson said in a statement responding to the report: "We value the hard work of content reviewers and have certain standards around their well-being and support."

“We work with only highly reputable global partners that have standards for their workforce, and we jointly enforce these standards with regular touch points to ensure the work environment is safe and supportive, and that the most appropriate resources are in place."
