Should Twitter be held responsible for an Isis terror attack?

No matter what the court rules, it’s clear many are dissatisfied with the role of social media in public life

Josh Marcus
San Francisco
Tuesday 21 February 2023 18:48 EST


What do social media networks owe the victims of terrorism? If extremists use social media to build a following and inspire attackers, are tech companies responsible when violence breaks out? Should companies like Twitter, Google, and Facebook proactively remove potential terrorist content, or wait until they receive specific complaints about impending attacks?

These are the thorny questions the US Supreme Court will take up this week in oral arguments for a pair of related cases.

Wherever you, and the court, come down on this conundrum, the forthcoming decisions are well worth your attention. The cases may concern terrorism, but they will shape how online speech works for millions.

In Twitter, Inc. v Taamneh, which will be argued at the high court on Wednesday, relatives of a victim of a 2017 Isis attack in Turkey sued Twitter, Google, and Facebook under the Anti-Terrorism Act (ATA), which lets US nationals harmed by international terrorism seek damages from someone who “aids and abets” extremist violence by “knowingly providing substantial assistance.”

Family members of Nawras Alassaf, whom Isis killed in an Istanbul nightclub, argued “the explosive growth of ISIS over the last few years into the most feared terrorist group in the world would not have been possible” without the companies. This unfortunate symbiotic relationship was made well known to these tech firms by an “avalanche of public information,” they emphasise, in the form of news reports, congressional testimony, and high-level warnings from US government officials to social media networks.

A lower court initially rejected the suit, finding the family had failed to state a claim. But the 9th Circuit Court of Appeals found the Alassaf family had plausibly argued the social media firms were aiding and abetting the general rise of Isis, even though the family acknowledged that the companies’ terms of use forbid terrorism and that the relatives were not taking issue with any specific post. The tech companies appealed to the Supreme Court, which took up the case last fall.

Social media is a strange beast: part personal diary, part news platform, part quasi public utility, perfectly suited as a recruiting and marketing tool for everyone from employers to influencers to extremists. It is this very amorphousness and ubiquity that makes it so hard to decide, legally, when to hold tech companies accountable.

The social media firms argue that failing to proactively restrict the speech of some of their millions of global users is far from the same thing as knowingly giving substantial assistance to terrorists. The Biden administration, in a friend-of-the-court brief, made similar arguments.

Meanwhile, a coalition of advocacy groups, including the Knight First Amendment Institute, the ACLU, and the Electronic Frontier Foundation, argues in a brief of its own that holding tech companies to account for terrorism in this way would inadvertently but “substantially narrow the speech that platforms host, raising serious First Amendment concerns.”

They point to a number of potential impacts of a ruling in favour of the family: tech companies pre-screening everyone’s posts to rule out offending content, journalists and activists being flagged for perfectly reasonable posts about terror groups, and over-the-top content moderation algorithms shutting down online discussion to avoid any whiff of liability.

A related case, Gonzalez v Google, also brought by family members of terror victims, asks similar questions under a different law: Section 230 of the 1996 Communications Decency Act, which courts have interpreted to shield tech companies from liability for user content on social media sites, and which companies have used to block numerous lawsuits. This is the first time the Supreme Court has taken up Section 230.

No matter what the conservative-leaning court decides is the correct interpretation of these laws, we must not lose sight of the moral question at issue here: what does justice for these families actually look like, and how do we balance their individual rights with our collective right as Americans to a free-flowing public discourse?

Even if Section 230 is found to shield these companies from liability, or the ATA is deemed an inappropriate vehicle for these claims, many people across the political spectrum feel that social media companies are damaging public life while avoiding the consequences.

How often, as is the case in the Google suit, are Ted Cruz and Joe Biden arguing for similar conclusions? As Alan Rozenshtein of the University of Minnesota recently told USA Today, speech cases like these have a way of generating “some very weird coalitions”.

That’s another way of describing a broadly held opinion. We should pay attention to those, since they are in short supply these days. They are symptoms of widely felt problems, and roadmaps for new possibilities.
