Tay tweets: Microsoft apologises for robot’s racist and genocidal tweets

‘Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay’, the company said, leading to it tweeting almost every kind of offensive message

Andrew Griffin
Sunday 27 March 2016 07:45 EDT
HAL, the out-of-control computer in 2001: A Space Odyssey (Warner Bros)

Microsoft has apologised after a robot it made tweeted “wildly inappropriate and reprehensible words and images” that included support for Hitler and genocide.

The company launched Tay, an artificially intelligent robot, on Twitter last week. It was intended to be a fun way of engaging people with AI – but instead it was tricked by users into tweeting support for Hitler and genocide and into repeating white power messages.

Microsoft said that it had no way of knowing that people would attempt to trick the robot into tweeting the offensive words, but apologised for letting it do so.

Microsoft said that it had launched Tay after the success of a similar robot, XiaoIce, in China. XiaoIce is used by 40 million people, who hold conversations with it, and it has even presented the weather on television.

“The great experience with XiaoIce led us to wonder: Would an AI like this be just as captivating in a radically different cultural environment?” wrote Peter Lee, Microsoft’s vice president for research. “Tay – a chatbot created for 18- to 24-year-olds in the US for entertainment purposes – is our first attempt to answer this question.”

What Microsoft found, however, was that launching the robot onto English-language, American Twitter would lead people to attempt to attack and trick it. The company said it had launched on the social network as a way of reaching a large number of users – but instead a small number of them were able to take advantage of it.

“Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay,” wrote Mr Lee. “Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack.

“As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time.”

The company said that the experience had shown the difficult challenges it faces in AI design.

“AI systems feed off of both positive and negative interactions with people,” Mr Lee wrote. “In that sense, the challenges are just as much social as they are technical.

“We will do everything possible to limit technical exploits but also know we cannot fully predict all possible human interactive misuses without learning from mistakes.”

The company said that it hoped to learn from the experience and would keep attempting to build a robot and an internet “that represents the best, not the worst, of humanity”.
