TikTok’s algorithm misreads creator’s pro-Black profile as a threat
The error occurred on TikTok’s beta Creator Marketplace, which links content makers with sponsors
TikTok blocked users of its Creator Marketplace from using the word “black” and phrases like “Black Lives Matter” in their bios, as the algorithm flagged them as “inappropriate content”.
Creator Ziggi Tyler discovered the issue while attempting to update his bio; the words “Black”, “Black Lives Matter”, “Black people”, “Black success”, “Pro-Black”, and “I am a Black man” were rejected, while “Pro-white” and “supporting white supremacy” were accepted by TikTok’s algorithms without issue.
TikTok’s Creator Marketplace is currently in invite-only beta testing, but aims to connect creators with brands for sponsorship deals.
TikTok said that the app mistakenly flagged the phrases because its hate speech detector associated the word “black” with “audience” – a word that contains the string “die”.
“Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order,” a TikTok spokesperson said in a statement.
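TikTok’s actual system is not public, but the failure mode it describes – matching flagged terms as substrings anywhere in the text, with no respect for word order or word boundaries – can be illustrated with a minimal, purely hypothetical sketch (the rule list and function names here are invented for demonstration):

```python
# Hypothetical sketch of a naive co-occurrence filter, NOT TikTok's code.
# It checks whether two flagged terms both appear as raw substrings,
# ignoring word order and word boundaries.

FLAGGED_PAIRS = [("black", "die")]  # invented example rule

def is_flagged(bio: str) -> bool:
    text = bio.lower()
    # Substring matching means "die" matches inside "audience",
    # so "Black audience" is flagged as if it contained both terms.
    return any(a in text and b in text for a, b in FLAGGED_PAIRS)

print(is_flagged("Black audience"))    # True – "die" hides inside "audience"
print(is_flagged("friendly audience")) # False – "black" is absent
```

A filter that tokenised the text and matched whole words (for example, using a word-boundary regex) would not produce this particular false positive, which is presumably the kind of fix the company’s statement alludes to.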
“We recognize and apologize for how frustrating this was to experience, and our team has fixed this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27B views on our platform.”
The issue is the latest in a series of examples of automated systems working against minorities. Instagram’s CEO Adam Mosseri said in June 2020 that the company needed to better support the Black community and was looking into how its “policies, tools, and processes impact black people”, including its own algorithmic bias.
Algorithmic censorship also saw posts from Palestinians about violence in Gaza taken down on Facebook, Instagram, and Twitter, and led to criticism over the black-box nature of these systems.
Outside of social media, other algorithms – including facial recognition systems – routinely fail to properly identify the faces of people of colour, who are already disproportionately targeted by police. In February 2019, Nijeer Parks spent 10 days in jail and paid $5,000 (£3,627) to defend himself after being misidentified by facial recognition software and subsequently arrested.
“Regardless of what the algorithm is and how it picked up, somebody had to program that algorithm,” Tyler told Recode. “And if [the problem] is the algorithm, and the marketplace has been available since [2020], why wasn’t this a conversation you had with your team, knowing there have been racial controversies?”