Social media companies 'actively' serve up extremist material to users to maximise profits, MPs say

'The algorithms that you profit from are being used to poison debate,' Yvette Cooper tells YouTube

Lizzie Dearden, Andrew Griffin
Wednesday 24 April 2019 16:36 EDT


Social media companies are “actively” pushing their users to consume extremist content in order to drive up profits, MPs have said.

Algorithms used by sites like YouTube to suggest new content to users are recommending inflammatory and radical material, a Home Affairs Committee hearing was told.

One member accused YouTube, Facebook and Twitter of “not giving a damn” about fuelling radicalisation in the wake of the massacres in Sri Lanka and New Zealand.

Following heated exchanges in Wednesday’s evidence hearing, Yvette Cooper said MPs had been “raising the same issues again and again” over several years.

As representatives of YouTube, Facebook and Twitter outlined action taken against extremist content, MPs provided fresh examples of neo-Nazi, Islamist and far-right posts on their platforms.

MPs took particular aim at YouTube over the way its algorithms promote videos and create playlists for viewers that they accused of becoming increasingly extreme.

The site has been repeatedly criticised for showing a variety of inflammatory content in the recommendations pane next to videos.

MPs said that could easily radicalise young people who begin watching innocent videos.

Conservative MP Tim Loughton said tests showed a benign search could end with “being signposted to a Nazi sympathiser group”.

He added: “There seems to be a systemic problem that you are actively signposting and promoting extremist sites.”

YouTube uses an algorithm to surface related and engaging content, so that users stay on the site by clicking through videos. It has never revealed the details of that algorithm, which allows YouTube to generate profits by showing more advertising the longer its users stay on the site.

MPs described how that chain of related videos would become more and more extreme, even if the first video had been relatively innocuous. Ms Cooper described clicking through videos and finding that “with each one the next one being recommended for me was more extreme”, going from right-wing sites to racist and radical accounts.

“The algorithms that you profit from are being used to poison debate,” she said.

Marco Pancini, the director of public policy at YouTube, said the logic behind its algorithms “works for 90 per cent of experience of users on the platform”.


But he said the Google-owned firm was “aware of the challenge this represents for breaking news and political speech”, and was working to prioritise authoritative content and reduce the visibility of extremists.

He pointed to the work it has done to prioritise authoritative sources when people are searching for political speech or breaking news. Some of that has led to controversies of its own, such as when YouTube accidentally linked a video of the Notre Dame fire to video of the 9/11 attacks.

Labour MP Stephen Doughty accused YouTube of becoming “accessories to radicalisation” and crime, but Mr Pancini replied: “That is not our intention … we have changed our policies.”

He said the company was working with non-governmental organisations in 27 European countries to improve detection of offensive content.

Mr Doughty said he found links to the websites of “well-known international organisations” and videos calling for the stoning of gay people on YouTube and other platforms.

“Your systems are simply not working and, quite frankly, it's a cesspit,” he added. “It feels like your companies really don't give a damn.

“You give a lot of words, you give a lot of rhetoric, you don't actually take action … all three of you are not doing your jobs.”

Representatives of Facebook, Twitter and YouTube said they had increased efforts against all kinds of extremism, using both automated technology and human moderators.

But the Islamist militant group behind the church and hotel bombings that left more than 300 people dead in Sri Lanka still has a Twitter account, and its YouTube channel was not deleted until two days after one of the world’s deadliest terror attacks.

A photo released by Isis's Amaq propaganda agency claiming to show suicide bombers who carried out the Sri Lanka attacks

Ms Cooper pointed to reports that clips of the Christchurch shooter’s Facebook Live video were still circulating and said she had been recommended Tommy Robinson videos on YouTube after a supposed crackdown.

The Labour MP revealed that overnight, she had been alerted to weeks-old posts on a closed Facebook group with 30,000 members that said she and her family should be shot as “criminals”.

“Kill them all, every f***ing one, military coup National Socialism year one – I don’t care as long as they are eradicated,” said another post that remained online.

Ms Cooper accused the social media giants of “providing safe places to hide for individuals and organisations who spread hate”.

“We have taken evidence from your representatives several times over several years, and we feel like we are raising the same issues again and again,” the former shadow home secretary added. “We recognise you have done some additional work but we are coming up time and again with so many examples of where you are failing, where you may be being gamed by extremists or where you are effectively providing a platform for extremism ... very little has changed.”

Police said Finsbury Park terror attacker Darren Osborne was radicalised online (Metropolitan Police/Reuters)

The committee previously heard that the far-right extremist who ploughed a van into Muslims in the Finsbury Park terror attack had been radicalised online.

Ms Cooper said material on Facebook, Twitter and YouTube “leads to people being hurt and killed”.

She warned that social media blocks like those implemented by Sri Lanka after the Easter Sunday attacks could be seen elsewhere “if you don’t get your act together”.

The UK government has proposed the creation of an independent regulator to create a code of practice for tech companies, and enforce it with fines and blocks.

Representatives of Facebook, Twitter and YouTube said they supported measures proposed in the Online Harms white paper, which is currently under consultation.

Neil Potts, Facebook’s public policy director, said the company now had 30,000 staff working on safety and security, including language and subject matter experts.

But when asked whether people spreading terrorist propaganda had been reported to police, Mr Potts said proactive referrals were made to law enforcement only when there was an “imminent threat”.

“You are not actually reporting the crimes that your platforms are making possible,” Ms Cooper said. “That is a serious problem.”


Katy Minshall, head of UK public policy at Twitter, said 1.4 million accounts had been removed for promoting terrorism and that the social network actively enforced its rules rather than relying on reports.

Twitter had 1,500 people working on policy enforcement and moderation around the world and was removing more content, she said, but was “never going to get a 100 per cent success rate”.

She added: “There is a likely risk in the next few years that the better our tools get, the more users are removed, the more they will migrate to parts of the internet where nobody is looking.”
