New Scientist


Mind-reading AI can turn even imagined speech into spoken words

A brain-computer interface has enabled people with paralysis to turn their thoughts directly into words, requiring less effort than earlier techniques that relied on a physical attempt to speak

By Christa Lesté-Lasserre

14 August 2025

Someone with paralysis using the brain-computer interface. The text above is the cued sentence and the text below is what is being decoded in real time as she imagines speaking the sentence


Emory BrainGate Team

People with paralysis can now have their thoughts turned into speech just by imagining talking in their heads.

While brain-computer interfaces can already decode the neural activity of people with paralysis when they physically attempt speaking, this can require a fair amount of effort. So Abramovich Krasa at Stanford University and his colleagues sought a less energy-intensive approach.

“We wanted to see whether there were similar patterns when someone was simply imagining speaking in their head,” he says. “And we found that this could be an alternative, and indeed, a more comfortable way for people with paralysis to use that kind of system to restore their communication.”

Abramovich Krasa and his colleagues recruited four people with severe paralysis as a result of either amyotrophic lateral sclerosis (ALS) or brainstem stroke. All the participants had previously had microelectrodes implanted into their motor cortex, which is involved in speech, for research purposes.

The researchers asked each person to attempt to say a list of words and sentences, and also to just imagine saying them. They found that brain activity was similar for both attempted and imagined speech, but activation signals were generally weaker for the latter.


The team trained an AI model to recognise those signals and decode them, using a vocabulary database of up to 125,000 words. To ensure the privacy of people’s inner speech, the team programmed the AI to be unlocked only when they thought of the password Chitty Chitty Bang Bang, which it detected with 98 per cent accuracy.
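The password-gating described above can be illustrated with a minimal sketch, assuming a simple design in which decoded output is withheld until a passphrase detector fires. All names, the class structure and the exact-match stand-in for the trained classifier are hypothetical, not from the study; only the passphrase and the 98 per cent detection figure come from the article.

```python
# Hypothetical sketch of password-gated decoding; not the study's actual pipeline.
PASSPHRASE = "chitty chitty bang bang"
CONFIDENCE_THRESHOLD = 0.98  # detection accuracy reported in the article


class GatedDecoder:
    """Withholds decoded inner speech until the passphrase is detected."""

    def __init__(self):
        self.unlocked = False

    def passphrase_confidence(self, decoded_text):
        # Stand-in for a trained classifier: an exact match scores 1.0.
        return 1.0 if decoded_text.lower() == PASSPHRASE else 0.0

    def process(self, decoded_text):
        if not self.unlocked:
            if self.passphrase_confidence(decoded_text) >= CONFIDENCE_THRESHOLD:
                self.unlocked = True
                return "[unlocked]"
            return None  # inner speech stays private while locked
        return decoded_text


decoder = GatedDecoder()
print(decoder.process("hello world"))               # withheld while locked
print(decoder.process("Chitty Chitty Bang Bang"))   # passphrase unlocks output
print(decoder.process("hello world"))               # now decoded openly
```

The design choice this illustrates is that privacy is the default state: nothing the decoder produces leaves the system until the user deliberately thinks the unlock phrase.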

Through a series of experiments, the team found that just imagining speaking a word resulted in the model correctly decoding it up to 74 per cent of the time.

This demonstrates a solid proof of principle for the approach, but it is less robust than interfaces that decode attempted speech, says team member Frank Willett, also at Stanford. Ongoing improvements to both the sensors and the AI over the next few years could make it more accurate, he says.

The participants expressed a strong preference for this system, which was faster and less laborious than those based on attempted speech, says Abramovich Krasa.

The concept takes brain-computer interfaces in “an interesting direction”, says a researcher at UMC Utrecht in the Netherlands. But the system doesn’t yet distinguish between attempted speech, the inner speech we want voiced and the thoughts we want to keep to ourselves, she says. “I’m not sure if everyone was able to distinguish so precisely between these different concepts of imagined and attempted speeches.”

She also says the password protection would need to be switched on and off mid-conversation, in line with the user’s decision about whether to voice what they are thinking. “We really need to make sure that BCI [brain-computer interface]-based utterances are the ones people intend to share with the world and not the ones they want to keep to themselves no matter what,” she says.

A researcher at Durham University in the UK says there is no reason to consider this system a mind-reader. “It really only works with very simple examples of language,” he says. “I mean if your thoughts are limited to single words like ‘tree’ or ‘bird,’ then you might be concerned, but we’re still quite a way away from capturing people’s free-form thoughts and most intimate ideas.”

Willett stresses that all brain-computer interfaces are regulated by federal agencies to ensure adherence to “the highest standards of medical ethics”.

Journal reference:

Cell

