Facebook says it can read thoughts with mind-reading device
'We're standing on the edge of the next great wave in human-orientated computing,' Facebook says
Facebook is working on a headset that can transfer a person’s thoughts directly onto a computer screen using a brain-machine interface.
A paper describing the technology, published in Nature Communications, reveals how the headset is able to decode brain activity to instantly transcribe what a person is saying into text on a computer screen.
The algorithm that decodes the activity can currently recognise only a small set of words and phrases, but the technology giant said its non-invasive, wearable device could one day allow people with paralysis to communicate.
The technology could also transform the utility of augmented reality glasses and virtual reality headsets through thought-based controls.
“Being able to recognise even a handful of imagined commands, like ‘home’, ‘select’, and ‘delete’ would provide entirely new ways of interacting with today’s VR systems – and tomorrow’s AR glasses,” Facebook wrote in a blog post describing the brain-computer interface device.
“Imagine a world where all the knowledge, fun, and utility of today’s smartphones were instantly accessible and completely hands-free... A decade from now, the ability to type directly from our brains may be accepted as a given. Not long ago, it sounded like science fiction. Now, it feels within plausible reach.”
Facebook first revealed its ambitions to read people’s minds in 2017 at its annual F8 conference, when Regina Dugan took to the stage and asked the question: “What if you could type directly from your brain?”
Since then, the company has been building a headset to make this a reality through its Facebook Reality Labs division, as well as through collaborations with some of the world’s leading universities.
It is not the only firm working on brain-machine interfaces with the hope of one day commercialising the technology.
Earlier this month, Elon Musk-founded startup Neuralink revealed its own device that can connect human brains directly to computers.
The key difference between Neuralink’s “threads” and Facebook’s headset is that Facebook’s device is non-invasive and does not require any form of surgery.
It works instead by measuring brain activity through sensors placed around a person’s head.
Given Facebook’s track record on user privacy, the researchers stressed the importance of taking safety and security into consideration when developing the device.
“We can’t anticipate or solve all of the ethical issues associated with this technology on our own. What we can do is recognise when the technology has advanced beyond what people know is possible, and make sure that information is delivered back to the community,” said Mark Chevillet, director of the brain-computer interface program at Facebook Reality Labs.
“Neuroethical design is one of our programme’s key pillars - we want to be transparent about what we’re working on so that people can tell us their concerns about this technology.”