Artificial intelligence translates thoughts into text using brain implant

Mind-reading AI can turn neural activity into sentences with a 97 per cent accuracy rate

Anthony Cuthbertson
Tuesday 31 March 2020 10:10 EDT
Facebook and Elon Musk's Neuralink are among the companies working on telepathic technologies (Mark Stone/University of Washington)


Scientists have developed an artificial intelligence system that can translate a person’s thoughts into text by analysing their brain activity.

Researchers at the University of California, San Francisco, developed the AI to decode a vocabulary of up to 250 words in real time, drawn from a restricted set of 30 to 50 sentences.

The algorithm was trained using the neural signals of four women with electrodes implanted in their brains, which were already in place to monitor epileptic seizures.

The volunteers repeatedly read sentences aloud while the researchers fed the brain data to the AI to unpick patterns that could be associated with individual words. The average word error rate across a repeated set was as low as 3 per cent.
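Word error rate, the figure reported here, is the standard speech-decoding metric: the word-level edit distance (substitutions, insertions and deletions) between the decoded sentence and the reference, divided by the reference length. A minimal, generic sketch of that calculation (not the study's own code):

```python
# Word error rate (WER): word-level Levenshtein distance divided by the
# number of words in the reference. Generic illustration only.

def wer(reference: str, hypothesis: str) -> float:
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the quick brown fox", "the quick brown fox"))  # 0.0
print(wer("the quick brown fox", "the quack brown fox"))  # 0.25 (1 error in 4 words)
```

A 3 per cent rate therefore means roughly one word wrong in every 33.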

“A decade after speech was first decoded from human brain signals, accuracy and speed remain far below that of natural speech,” states a paper detailing the research, published this week in the journal Nature Neuroscience.

“Taking a cue from recent advances in machine translation, we trained a recurrent neural network to encode each sentence-length sequence of neural activity into an abstract representation, and then to decode this representation, word by word, into an English sentence.”
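What the quote describes is a standard sequence-to-sequence setup: one recurrent network folds the whole neural recording into a single fixed vector, and a second one emits words from it one at a time. The sketch below is only an untrained toy illustration of that shape; the sizes, weights and four-word vocabulary are invented here and nothing comes from the study's code.

```python
# Toy encoder-decoder RNN over fake "neural activity". Untrained random
# weights, so the output words are arbitrary; the point is the data flow:
# (time, channels) array -> one hidden vector -> word-by-word decoding.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["<s>", "hello", "world", "</s>"]     # made-up 4-word vocabulary
N_CH, HID = 8, 16                             # electrode channels, hidden size

W_enc_x = rng.normal(size=(HID, N_CH)) * 0.1  # encoder input weights
W_enc_h = rng.normal(size=(HID, HID)) * 0.1   # encoder recurrent weights
W_dec_h = rng.normal(size=(HID, HID)) * 0.1   # decoder recurrent weights
W_emb   = rng.normal(size=(HID, len(VOCAB))) * 0.1  # word embeddings
W_out   = rng.normal(size=(len(VOCAB), HID)) * 0.1  # hidden -> vocab scores

def encode(neural_seq):
    """Fold a (time, channels) recording into one abstract representation."""
    h = np.zeros(HID)
    for x in neural_seq:
        h = np.tanh(W_enc_x @ x + W_enc_h @ h)
    return h

def decode(h, max_len=10):
    """Greedily emit words until the end token (or max_len)."""
    words, token = [], 0                      # start from <s>
    for _ in range(max_len):
        h = np.tanh(W_dec_h @ h + W_emb[:, token])
        token = int(np.argmax(W_out @ h))
        if VOCAB[token] == "</s>":
            break
        words.append(VOCAB[token])
    return words

fake_recording = rng.normal(size=(50, N_CH))  # 50 time steps of activity
print(decode(encode(fake_recording)))         # untrained, so arbitrary words
```

In the actual study the encoder consumed real electrocorticography signals and both networks were trained end to end on the volunteers' repeated readings.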

The average active vocabulary of an English speaker is estimated to be around 20,000 words, meaning the system is a long way off being able to understand regular speech.

Researchers are unsure about how well it will scale up, as the decoder relies on learning the structure of a sentence and using it to improve its predictions. This means that each new word increases the number of possible sentences, therefore reducing the overall accuracy.
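The scaling problem is visible in a back-of-the-envelope count: with V candidate words per position, the number of possible length-n word sequences grows as V to the power n. The figures below are illustrative only.

```python
# Candidate-sentence counts explode with vocabulary size: V words per
# slot gives V**n possible length-n sequences. Illustrative numbers only.
for vocab_size in (250, 20_000):
    for length in (5, 10):
        count = vocab_size ** length
        print(f"V={vocab_size:>6}, n={length:>2}: {count:.2e} sequences")
```

Going from the study's 250-word vocabulary to a typical 20,000-word one multiplies the search space by many orders of magnitude at every sentence length.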

“Although we should like the decoder to learn and to exploit the regularities of the language, it remains to show how much data would be required to expand from our tiny languages to a more general form of English,” the paper states.

One possibility could be to combine it with other brain-computer interface technologies that use different types of implants and algorithms.

Last year, a report by the Royal Society claimed that neural interfaces linking human brains to computers will enable mind reading between people.

The report cited technologies currently being developed by Elon Musk’s Neuralink startup and Facebook, which describe cyborg telepathy as “the next great wave in human-oriented computing”.

Neuralink says learning to use its device is ‘like learning to touch type or play the piano’ (Neuralink)

The Royal Society estimated that such interfaces will be an “established option” for treating diseases like Alzheimer’s within two decades.

“People could become telepathic to some degree, able to converse not only without speaking but without words,” the report stated, while expanding on more futuristic applications like being able to virtually taste and smell without physically experiencing the sensation.

“Someone on holiday could beam a ‘neural postcard’ of what they are seeing, hearing or tasting into the mind of a friend back home.”
