Not even chatbots want to talk about the weather

Rhodri Marsden tried to make casual chat with a bot known as Ultra Hal

Rhodri Marsden
Wednesday 01 July 2015 17:41 EDT
Getting to know you: Ultra Hal, Zabaware's chatbot (Zabaware/YouTube)

Yesterday afternoon, I ran out of people to complain to about the weather, so I leapt into the unwelcoming arms of artificial intelligence. As London reached the temperature and consistency of a recently microwaved lasagne, I tried to make casual chat with Cleverbot, an online chatbot that's been around for a few years.

"Man alive, it's hot," I typed. "Do you believe in colours?" it replied. Ugh. I wasn't there to talk colours – I wanted to indulge in light chat about sultry weather. Desperate, I visited zabaware.com, the home of a bot known as Ultra Hal. "Hal can discuss any topic and learn and evolve from your conversations," says the blurb. "Man alive, it's hot," I typed. "What would it take to get you to reconsider?" asked Hal. I closed the browser window and went for a lie down.

We're often told that we're marching with great speed towards the Singularity, the point where artificial intelligence (AI) usurps our own. While various Turing tests (computers managing to convince humans that they're human) have supposedly been passed, they usually involve bending the rules, and I remain sceptical that we're anywhere near the point where I might be caught out. That scepticism is hardened by chatbots that fail to learn from their mistakes and offer only a choice between placid agreement and rambling non sequiturs. One YouTube video of a conversation between two chatbots proceeds thus: "What do you want to talk about?" / "In the lingo of the economist the ten commandments talk about property rights." That's not a conversation I want to eavesdrop upon.

But interesting work is being done. Some researchers at the LSE recently conducted experiments into "echoborgs" – getting humans to deliver AI responses to other humans – to see if giving chatbots human faces made them seem more "real" to us. They reckon that it does. And various kinds of human bridge between us and AI are now being used in a number of services, mostly SMS-based, mostly in the USA. Cloe is on hand to help you find local restaurants. Jarvis assists you with scheduling meetings. Riley finds you apartments. These are all powered by machines that get to know you and your behaviour, but have a human front-end to ensure that the messages don't make you roll your eyes in frustration.

It almost harks back to AQA, the SMS-powered service staffed by humans (but powered by search engines) that answers questions in return for £2.50 – but this new breed of service is more about building relationships. And, weirdly, we seem to be keen on them. Invisible Girlfriend and its counterpart, Invisible Boyfriend, are staffed by a team of a few hundred people offering "meaningful conversations" while also providing "real-world proof that you're in a relationship". The service was initially conceived as a chatbot, but it became clear to its founder that the technology wasn't up to the task, so humans were recruited instead.

Steve Rousseau, an editor at digg.com, recently wrote a piece explaining how he tried Invisible Girlfriend as a joke, but ended up finding some small amount of meaning therein: "a text from a stranger, comforting another stranger." And how strange it is that, despite all the connections social media facilitates, we can find ourselves looking to strangers sitting at computers to reassure us that someone is out there. Even if it's only to talk about the weather.
