
Heated words after Instagram chief says posts viewed by Molly Russell were safe

Elizabeth Lagone apologised for content Molly viewed that ‘violated our policies’.

Josh Payne
Monday 26 September 2022 12:31 EDT
Undated family handout file photo of Molly Russell. Social media content viewed by a teenager in the weeks before she took her own life is too disturbing for even an adult to look at for a long period of time, a coroner’s court has heard. (PA Media)


An Instagram executive was involved in a heated exchange about allowing children on the site during an inquest into the death of schoolgirl Molly Russell, in which the family’s lawyer shouted: “Why on earth are you doing this?”

Meta’s head of health and wellbeing, Elizabeth Lagone, said content viewed by the 14-year-old on the platform, which her family argued “encourages” suicide and self-harm, was safe.

Despite defending the site throughout the hearing, Ms Lagone apologised for content that “violated our policies” which was viewed by Molly before she died.

Towards the end of her evidence, the lawyer acting on behalf of Molly’s parents, Oliver Sanders KC, raised his voice before asking why Instagram permitted children on the platform when it was “allowing people to put potentially harmful content on it”.

Mr Sanders suggested Meta “could just restrict it to adults”, before forcibly putting down his folder and saying Instagram had “no right” to decide what content children could view.

He shouted: “You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don’t have the capacity to consent to this.”

Ms Lagone was taken through a number of posts the schoolgirl engaged with on the platform in the last six months of her life, which she described as “by and large, admissive”.

The senior executive told an inquest at North London Coroner’s Court she thought it was “safe for people to be able to express themselves”, but conceded a number of posts shown to the inquest would have violated Instagram’s policies.

Molly, from Harrow in north-west London, died in November 2017, prompting her family to campaign for better internet safety.

During Monday’s proceedings, videos the teenager accessed on Instagram were played to the court with the coroner once again warning the material had the “potential to cause great harm”.

He said the content “seeks to romanticise and in some way validate the act of harm to young people,” before urging anyone who wanted to leave the room to do so, with one person leaving.

The clips contained references to popular culture, with the witness saying they were allowed on the platform in 2017 because they were “fictional montages”.

Addressing Ms Lagone in the witness box, Mr Sanders prompted an interjection from Meta’s lawyer, Caoilfhionn Gallagher KC, who asked for the coroner to remind him of how witnesses should be questioned in an inquest.

Turning to Mr Sanders, the coroner said: “You have put your point.”

Before the interjection, Mr Sanders raised his voice and asked: “The point is this isn’t it, Instagram portrays itself as trying to strike this very, very difficult balance between who gets harmed by content you are posting, you are balancing and running risks and it really comes back to the question the coroner asked you; why on earth are you doing this?”

Ms Lagone told the inquest the topic of harm was an “evolving field” and that Instagram policies were designed with consideration to users aged 13 and over.

Earlier in her evidence, Mr Sanders said to the witness: “I suggest to you that it is an inherently unsafe environment… dangerous and toxic for 13 to 14-year-olds alone in their bedrooms scrolling through this rubbish on their phones.”

“I respectfully disagree,” Ms Lagone responded.

The inquest was told out of the 16,300 posts Molly saved, shared or liked on Instagram in the six-month period before her death, 2,100 were depression, self-harm or suicide-related.

Referring to all the material viewed by the teenager the family considered to be “encouraging” suicide or self-harm, Mr Sanders continued: “Do you agree with us that this type of material is not safe for children?”

Ms Lagone said policies were in place for all users and described the posts viewed by the court as a “cry for help”.

“Do you think this type of material is safe for children?” Mr Sanders continued.

Ms Lagone said: “I think it is safe for people to be able to express themselves.”

After Mr Sanders asked the same question again, Ms Lagone said: “Respectfully, I don’t find it a binary question.”

The coroner interjected and asked: “So you are saying yes, it is safe or no, it isn’t safe?”

“Yes, it is safe,” Ms Lagone replied.

The coroner continued: “So having created this environment, you then seek to make it safe?”

Ms Lagone replied: “Certainly, we take the safety of users very seriously…”

“What did people do before people created this environment for them?” the coroner asked.

“I’m not sure what you mean,” Ms Lagone said.

“You create the danger and then you take steps to lessen the risk and danger?” the coroner said.

Ms Lagone replied: “The technology has been developed and… we take our responsibility seriously to have the right policies and processes in place.”

Responding to questioning from Mr Sanders about whether she was sorry about the content Molly saw, Ms Lagone told the court: “We are sorry that Molly saw content that violated our policies and we don’t want that on the platform.”

The inquest, expected to last two weeks, continues.
