
Remove extremist posts in an hour or face fines, European Commission tells tech giants

Ben Chapman
Wednesday 12 September 2018 06:24 EDT
'One hour is the decisive time window in which the greatest damage takes place,' Jean-Claude Juncker said in his annual State of the Union address to the European Parliament (Reuters)


Google, Twitter and Facebook could face fines from the EU if they fail to take down extremist content within one hour, the European Commission president has warned.

“One hour is the decisive time window in which the greatest damage takes place,” Jean-Claude Juncker said in his annual State of the Union address to the European Parliament.

Technology giants have come under sustained criticism for not doing enough to prevent material linked to terrorists, racists and other violent groups from circulating on their platforms.

In March, the companies were given three months to demonstrate that they were acting faster to take down offending material, but EU regulators say too little has been done.

Under the Commission’s new proposal, which will need backing from EU member states and the European Parliament, internet firms could face fines of up to 4 per cent of turnover if they fail to take down offending material within an hour of being notified of its existence.

The draft rules would require national governments to have the capacity to identify extremist content online, and to put in place sanctions and an appeals procedure.

Google, Facebook and Twitter have all pledged to crack down on extremist material after public outcry and calls from politicians.

In December, the chair of the Commons Home Affairs Committee accused Facebook, Twitter and YouTube of creating a “bubble of hate” with their algorithms.

Labour MP Yvette Cooper said police were “extremely worried” about the role of technology in extremism and online grooming.

“Your algorithms are doing that grooming and that radicalisation because once people go on one slightly dodgy thing, you are linking them to an awful lot of other similar things,” she told a hearing in the Commons.

The Counter-Terrorism Internet Referral Unit (CTIRU) has instigated the removal of over 300,000 online videos, web pages and posts in total since it was established in 2010.

The vast majority related to Islamic extremism, but the number of reports about far-right material online has surged over the last year, the CTIRU said.
