Google, Facebook and Twitter face big fines if they don’t remove extremist content within an hour
Remove extremist posts in an hour or face fines, European Commission tells tech giants
Google, Twitter and Facebook could face fines from the EU if they fail to take down extremist content within one hour, the European Commission president has warned.
“One hour is the decisive time window the greatest damage takes place,” Jean-Claude Juncker said in his annual State of the Union address to the European Parliament.
Technology giants have come under sustained criticism for not doing enough to prevent material linked to terrorists, racists and other violent groups circulating on their platforms.
In March, the companies were given three months to demonstrate that they were acting faster to take down offending material, but EU regulators say too little has been done.
Under the Commission’s new proposal, which will need backing from EU member states and the European Parliament, internet firms could face fines of up to 4 per cent of turnover if they fail to take down offending material within an hour of being notified of its existence.
The draft rules will demand that national governments have the capacity to identify extremist content online, and put in place sanctions and an appeals procedure.
Google, Facebook and Twitter have all pledged to crack down on extremist material after public outcry and calls from politicians.
In December, the chair of the Commons Home Affairs Committee accused Facebook, Twitter and YouTube of creating a “bubble of hate” with their algorithms.
Labour MP Yvette Cooper said police were “extremely worried” about the role of technology in extremism and online grooming.
“Your algorithms are doing that grooming and that radicalisation because once people go on one slightly dodgy thing, you are linking them to an awful lot of other similar things,” she told a hearing in the Commons.
The Counter-Terrorism Internet Referral Unit (CTIRU) has instigated the removal of over 300,000 online videos, web pages and posts in total since it was established in 2010.
The vast majority related to Islamic extremism, but the number of reports about far-right material online has surged over the last year, CTIRU said.