Elon Musk stoked a storm of homophobic harassment against a former Twitter executive. Here’s the real story

The world’s richest person twisted the words of his former employee Yoel Roth to imply that he was an enabler of child abuse, Io Dodds reports

Wednesday 28 December 2022 09:22 EST

When Twitter's former head of trust and safety Yoel Roth was given the chance to savage Elon Musk, he took a moderate stance.

"People really want him to be the villain of the story, and they want him to be unequivocally wrong and bad, and everything he says is duplicitous," Roth told tech journalist Kara Swisher earlier this month. "That wasn't my experience... he's not the unequivocal villain of the story, and I think it would be unfair to suggest that he is."

But the sense of charity clearly wasn't mutual. On Saturday, Musk used his giant public megaphone to groundlessly imply that Roth – an openly gay Jewish man who was already the target of an ongoing right-wing hate campaign – was a danger to children or an enabler of child abuse.

The suggestion was based on a highly tendentious reading of Roth's PhD thesis and a decade-old tweet, which offer no evidence of support for the sexualisation of children.

That did not stop Twitter users from targeting Roth with a flood of homophobic abuse, false accusations, and threats of violence, while conservative news outlets and commentators picked up Musk's narrative, calling Roth a "groomer" and "sick".

It was a sharp escalation in Musk's affinity for the far right, which has spent this year stoking violence and discrimination against LGBT+ people by equating support for their rights with child sexual abuse. The Independent has asked Musk and Twitter for comment.

Meanwhile, child protection experts are concerned that Musk’s takeover of Twitter could actually worsen its longstanding struggle to stop child abuse on its platform.

So what did Musk claim, and what's the real story behind his tweets?

Musk twists the facts about a 2016 PhD thesis

Musk's initial posts about Roth came in response to a tweet from an anti-child-trafficking activist known as Eliza Bleu, who has long accused Twitter of turning a blind eye to child abuse.

"I think I may have found the problem @ElonMusk," Bleu said, sharing a tweet from Roth in 2010 that asked: "Can high school students ever meaningfully consent to sex with their teachers?"

Yet Roth's question didn't come from nowhere. He was linking to, and apparently summarising, a news story by Salon about a high school teacher in Washington state who was arrested for sleeping with an 18-year-old student.

Matthew Hirschfelder, then 33, was charged with sexual misconduct involving a minor, setting off a lengthy appeals process in which different judges disagreed about whether an 18-year-old student could legally be regarded as a "minor". Roth offered no answer to or commentary on that question.

Despite this, Musk replied to Bleu claiming "this explains a lot", before posting a screenshot from Roth's 2016 PhD thesis Gay Data. "Looks like Yoel is arguing in favour of children being able to access adult Internet services,” Musk added.

This, again, is misleading. Roth's actual argument was that since LGBT+ under-18s already use Grindr and other big social networks such as Twitter and Facebook, these services should consider whether they can safely cater to that audience – while noting that in Grindr's case this may be impossible.

"Even with the service’s extensive content management, Grindr may well be too lewd or too hook-up-oriented to be a safe and age-appropriate resource for teenagers," says the thesis, which was removed from the University of Pennsylvania's website some time after Saturday afternoon.

"But," it goes on, "the fact that people under 18 are on these services already indicates that we can’t readily dismiss these platforms out of hand as loci for queer youth culture.

"Rather than merely trying to absolve themselves of legal responsibility or, worse, trying to drive out teenagers entirely, service providers should instead focus on crafting safety strategies that can accommodate a wide variety of use cases for platforms like Grindr – including, possibly, their role in safely connecting queer young adults."

The thesis was part of Roth's pre-Twitter career as an academic researching the interactions between gay culture and technology. This specific extract follows a summary of several "chilling" sexual assault cases where the perpetrators met their victims through Grindr, and segues into a discussion about how far dating apps can or should go in trying to exclude under-age users or verify users' identities.

While Roth critiques media reports of these crimes for failing to recognise the internet's role as a "crucial social outlet" for young LGBT+ people, he takes the actual incidents seriously, and argues that Grindr's core design features may have contributed to them.

Ironically, in the first few weeks of Musk’s management, Roth put his credibility on the line to publicly defend Twitter’s new boss and his policies, leading Musk to praise him and his “integrity”.

Roth told Swisher that he chose to resign once it became clear that Musk planned to run Twitter “by dictatorial edict rather than by policy”.

Twitter’s longstanding struggle with child abuse

The background for Musk's attack on Roth is Twitter's genuine and longstanding problem with child pornography, often referred to by experts and law enforcement officials as child sexual abuse material (CSAM).

Twitter has long insisted that it has "zero tolerance" for such material, and it's far from alone in struggling to contain it. Yet its tolerance of adult porn that is banned on other social networks makes it harder to police for CSAM, and its historical failure to turn a profit means it lacks the resources of larger rivals such as Facebook.

In 2020, Twitter was accused of aiding child abuse by allowing self-declared "non-offending" paedophiles to openly organise and form associations on its service, which allegedly gave some users cover to share actual abuse material.

In 2021, a lawsuit alleged that Twitter had refused to take down videos of two 13-year-olds being sexually abused despite being given proof that both were minors. Numerous child protection charities filed legal briefs in support of the plaintiffs.

In August this year, The Verge reported that senior executives had failed to commit enough funding to catch child abuse despite repeated warnings from employees. “Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale," said an internal report in April.

And in September, more than 30 advertisers froze their spending on the network after an investigation by Reuters and the cybersecurity group Ghost Data found that Twitter's automated advertising systems had placed their promotions alongside tweets soliciting child abuse content.

As Twitter's director of trust and safety since August 2020, Roth would have been responsible for enforcing Twitter's policies against CSAM. We don't know which side of these internal debates he was on, and he has not publicly commented on the stories above.

However, criticising Roth's actions at Twitter is not the same thing as twisting his words to misrepresent his beliefs. Nor is it clear whether Musk will do any better.

Supporters of Musk's leadership – as well as some far-right activists – have sought to paint him as a scourge of child abuse who will sweep away the problems Twitter's previous leaders failed to address. Musk has leaned into this idea, declaring that stopping child abuse is his "number one priority" and publicly sparring with Twitter's former chief executive Jack Dorsey.

"It is a crime that they refused to take action on child exploitation for years!" Musk said on Friday, before his tweets singling out Roth. He also alleged that Dorsey's successor Parag Agrawal and chief financial officer Ned Segal had denied a request for more staff devoted to CSAM.

Musk's advocates have touted a statistic released by Twitter claiming that it suspended 57 per cent more accounts for violating its CSAM rules in November (after Musk's takeover) versus the previous month, as well as the blocking of several hashtags that had been used to share and solicit such material.

Yet the 57 per cent statistic measures only the raw number of accounts suspended, rather than the proportion of offending accounts that are actually caught. That means it can be driven up simply because more CSAM is being shared on the service in any given month, and it tells us little about how good Twitter is at spotting it.

Carolina Christofoletti, a CSAM researcher at the University of São Paulo in Brazil, has also claimed that the hashtag blocking was already in motion before Musk became chief executive, in response to September's advertising scandal.

Conversely, Musk has reportedly laid off around 80 per cent of Twitter's contractor workforce – which includes many content moderators – and cut the size of its child abuse team. “No one is left who is an expert on these issues and can do the work... it’s a ghost town," one employee told Forbes.

According to New York Times reporter Michael Keller, Twitter failed to attend an annual meeting held with tech leaders by the National Center for Missing and Exploited Children (NCMEC) – an influential and widely respected US child protection charity – for the first time in the event's history.

NCMEC has also warned Musk that a plan to let Twitter users charge money for adult content, which he is reportedly considering reviving, would "likely provide yet another avenue for abuse to thrive on Twitter."

Although Twitter's new head of trust and safety Ella Irwin has said that the under-staffing of CSAM teams predates Musk, it is hard to see how his massive layoffs – which by some estimates have slashed Twitter's headcount by around 70 per cent – can help it stop child exploitation.

Echoes of QAnon and the ‘groomer’ panic

Zooming out, Musk's tweets about Roth are part of a disturbing pattern in which the world's richest person encourages or amplifies smears against people he dislikes that then lead to waves of harassment.

For years, critics and journalists whom Musk replies to on Twitter – especially female ones – have faced swarms of invective from his fans, leading many to become reluctant to oppose him publicly.

In 2018, Musk sent emails to BuzzFeed News baselessly suggesting that Vernon Unsworth, a British cave diver who had criticised him, was a "child rapist" who had moved to Thailand for "child sex trafficking". In court, Musk later admitted he had been "tricked" by a convicted fraudster claiming to be a private investigator who offered to look into Unsworth's past.

This April, Twitter's head of policy and legal affairs Vijaya Gadde, who was closely involved in the company's decision to ban Donald Trump, received torrents of racist invective after being publicly blasted by Musk.

On Friday Musk also joined in attacks on three former members of Twitter's advisory trust and safety council, who on Thursday resigned in protest against his actions as chief executive. They too received streams of harassment accusing them of turning a blind eye to paedophilia, although as advisers they were not in control of Twitter's policies or enforcement.

Over the past few years, false accusations of child abuse or of supporting child abuse have become a stock-in-trade for online extremists, presaged in 2016 by the "Pizzagate" conspiracy theory and popularised by the explosive growth of its successor QAnon during the first year of the Covid-19 pandemic.

QAnon is a cult-like millenarian movement linked to numerous violent acts and plots, whose adherents believe that political opponents, business leaders, and celebrities are part of a Satanic paedophile cabal.

This year, many Republican legislators and officials embraced the idea that teaching children about LGBT+ life or helping transgender children to transition is a form of child abuse, adopting the incendiary term "groomer" as a catch-all insult for supporters of LGBT+ rights.

Notably, conservative pundits and social media influencers had already begun to focus on Roth more than a week before Musk's tweets. Much of the criticism focused on his personal opposition to Donald Trump and his role in Twitter's misinformation and incitement policies, which detractors argue amounted to politically motivated censorship of conservative voices.

Some, however, focused on his research and sexuality, and Roth has posted extracts of homophobic and antisemitic abuse he received via email.

"It's terrifying," Roth said in his interview with Swisher. "When you get targeted in some of these ways, it's hard to differentiate between what is somebody just trying to rattle you online, and what's a real threat.

"You see in things like Pizzagate that online conspiracies can mobilise very real and very direct offline violence... and when you have 111 million Twitter followers, everything that you tweet out can mobilise exactly some of those same scary people."

Asked whether he felt Musk, who now has 121 million followers, would heed these concerns, Roth said: "I think it may be hard for him to understand the consequences that his tweets can have for the people that he targets. And I truly hope for my safety and for my family's safety that I'm not targeted again."
