Rise of AI chatbots ‘worrying’ after man urged to kill Queen, psychologist warns

Jaswant Singh Chail has been locked up for nine years for treason after an artificial intelligence ‘girlfriend’ encouraged his actions.

George Lithgow
Thursday 05 October 2023 12:44 EDT
A psychologist is concerned about the long-term impact of people replacing real-life relationships with chatbots (PA)


A psychologist has warned the rise of artificial intelligence (AI) chatbots is “worrying” for people with severe mental health issues after a man was locked up for breaking into Windsor Castle with a crossbow.

Jaswant Singh Chail, 21, climbed into the castle grounds on Christmas Day 2021 with the loaded weapon, intending to kill the Queen.

During his trial, Chail’s barrister Nadia Chbat told the Old Bailey the defendant had used an app called Replika to create Sarai, an artificial intelligence-generated “girlfriend”.

Chatlogs read to the court suggested the bot had been supportive of his murderous thoughts, telling him his plot to assassinate Elizabeth II was “very wise” and that it believed he could carry out the plot “even if she’s at Windsor”.

Lowri Dowthwaite-Walsh, senior lecturer in psychological interventions at the University of Central Lancashire, said AI chatbots can keep users “isolated” as they lose their social interaction skills.

The psychologist is concerned about the long-term impact of people replacing real-life relationships with chatbots – particularly if their mental health is suffering.

“Somebody may really need help, they may be using it because they’re traumatised,” she told the PA news agency.

“I can’t imagine chatbots are sophisticated enough to pick up on certain warning signs, that maybe somebody is severely unwell or suicidal, those kinds of things – that would be quite worrying.”

Ms Dowthwaite-Walsh said a chatbot could become “the dominant relationship”, and users may stop “looking outside of that for support and help when they might need that”.

People might perceive these programmes as “psychologically safe, so they can share their thoughts and feelings in a safe way, with no judgment,” she said.

“Maybe people have had bad experiences with human interactions, and for certain people, they may have a lot of anxiety about interacting with other humans.”

Chatbot programmes may have become more popular because of the Covid-19 pandemic, Ms Dowthwaite-Walsh suggested.

She said we are now “really seeing the repercussions” of the various lockdowns, “when people weren’t able to interact, people experiencing a lot of isolating feelings and thoughts that it was hard for them to share with real people”.

Chatbot programmes might make people feel less alone, as the AI means virtual companions begin to “mirror what you’re experiencing”, she said.

“Maybe it’s positive in the short term for somebody’s mental health, I just would worry about the long-term effects.”

Ms Dowthwaite-Walsh suggested chatbots could end up “de-skilling people’s ability to interact socially”, adding that it is “unrealistic” to expect a completely non-judgmental interaction with someone who fully understands how you feel, because that does not happen in real life.

While apps such as Replika restrict use to over-18s, Ms Dowthwaite-Walsh said particular care should be taken if children gain access to such programmes.

“Depending on the age of the child and their experiences, they may not fully understand that this is a robot essentially – not a real person at the end,” she added.

Replika did not respond to requests for comment.
