YouTube to delete thousands of accounts after it bans supremacists, conspiracy theorists and other ‘harmful’ users

Users who promote ideas such as flat earth and fake cures for illnesses will be hidden

Andrew Griffin
Wednesday 05 June 2019 13:08 EDT


YouTube will delete thousands of accounts after banning "supremacists", conspiracy theorists and other harmful users, it has said.

The decision was made after an in-depth review of its rules on hateful content, YouTube said. While it has always banned hate content in general, the site has allowed some specific kinds of harmful videos – such as those promoting Nazi ideology or claiming 9/11 did not happen – to continue being hosted on the site.

Those videos, as well as other kinds of "supremacist" content, will now be officially banned.

"Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," it wrote in a blog post.

That is expected to lead to the removal of thousands of accounts as the policy comes into force, though that could take some time. "We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we'll be gradually expanding coverage over the next several months," its announcement read.

It did not give any specific examples of accounts that would be removed.

It noted that some of those accounts are useful to researchers, and said it would work on ways of making sure they stay available. It also said the change would not affect videos that discuss "pending legislation, aim to condemn or expose hate, or provide analysis of current events".

It will also alter its algorithm so that certain kinds of misleading and harmful videos, such as those promoting fake miracle cures or the flat Earth hoax, stop being recommended in YouTube's "up next" sidebar. It will instead promote more authoritative videos, to discourage people from being tricked by those stories.

It has already trialled this system in the US, where it said it has found success. It will bring it to more countries by the end of the year, while tuning the algorithm so that it is more efficient and can spot more content.

It also said it would work harder to stop YouTube users promoting harmful content from receiving ad money. Channels that "repeatedly brush up against" its hate speech policies will be suspended from the company's partner programme.

"The openness of YouTube’s platform has helped creativity and access to information thrive," its blog post concluded. "It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence.

"We are committed to taking the steps needed to live up to this responsibility today, tomorrow and in the years to come."

The change comes on the same day as the company said it would not remove videos in which one of its stars attacked another user over his sexuality, using a series of anti-gay slurs. Strangely, although the company now explicitly bans videos that encourage discrimination or segregation based on sexuality, it made no reference to that high-profile case in its blog post, and did not say that it would change its position.

YouTube has been repeatedly criticised for its relatively lax approach towards various kinds of harmful content, including those on the far-right. That criticism became even more prominent in the wake of the Christchurch shooting, when it and other video sites failed to quickly remove videos of the mass murder.

As such, the site has been repeatedly accused of not only permitting but also encouraging extremism, by playing host to often violent and niche accounts.

But right-wing channels also account for a significant share of YouTube's content and viewership. Earlier this year, Bloomberg reported that far-right videos were among the site's most popular categories.

The decision also comes amid increasing scrutiny from conservative politicians about whether YouTube has a bias against right-wing creators. As with Twitter and Facebook, the company has been criticised for undermining free speech and being unfair towards its conservative users, despite the fact there is no evidence of those accounts being discriminated against.
