New Brain Device Is First to Read Out Inner Speech
A new brain prosthesis can read out inner speech in real time, helping people with ALS or brain stem stroke communicate quickly and comfortably
After a brain stem stroke left him almost entirely paralyzed in the 1990s, French journalist Jean-Dominique Bauby wrote a book about his experiences—letter by letter, blinking his left eye in response to a helper who repeatedly recited the alphabet. Today people with similar conditions often have far more communication options. Some devices, for example, track eye movements or other small muscle twitches to let users select words from a screen.
And on the cutting edge of this field, neuroscientists have more recently developed brain implants that can turn neural signals directly into whole words. Most of these brain-computer interfaces (BCIs) require users to physically attempt to speak, however, and that can be a slow and tiring process. But now a new neural prosthesis changes that, allowing users to communicate simply by thinking what they want to say.
Speech BCIs generally rely on sensors implanted in the motor cortex, the brain region that controls the muscle movements of speech. But the motor cortex doesn’t only light up when we attempt to speak; it’s also involved, to a lesser extent, in imagined speech. Erin Kunz of Stanford University and her colleagues took advantage of this to develop an “inner speech” decoder and published the results on Thursday in Cell. The team studied three people with amyotrophic lateral sclerosis (ALS) and one with a brain stem stroke, all of whom had previously had sensors implanted in their motor cortex. Using the new system, the participants needed only to think a sentence they wanted to say, and it would appear on a screen in real time. Whereas previous inner speech decoders were limited to only a handful of words, the new device allowed participants to draw from a dictionary of 125,000 words.
A participant uses the inner speech neuroprosthesis. The text above is the cued sentence, and the text below is what is decoded in real time as she imagines speaking the sentence.
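The study’s code isn’t reproduced here, and everything in the sketch below is an assumption made for illustration: the toy phoneme inventory, the three-word stand-in vocabulary, and the function names are all invented. Still, the general recipe behind speech neuroprostheses, mapping neural activity to phoneme probabilities and then searching a vocabulary for the best-matching word, can be shown in miniature:

```python
# Hypothetical sketch only: a toy version of the decoding idea described
# above. Neural activity recorded while a participant imagines speaking is
# mapped to per-frame phoneme probabilities, and the most likely vocabulary
# word is chosen. The phoneme set, vocabulary, and all names are invented
# for illustration; none of this is the study's actual code.

import numpy as np

PHONEMES = ["HH", "EH", "L", "OW", "W", "ER", "D"]  # toy phoneme inventory
VOCAB = {  # tiny stand-in for the study's 125,000-word dictionary
    "hello": ["HH", "EH", "L", "OW"],
    "world": ["W", "ER", "L", "D"],
    "low": ["L", "OW"],
}

rng = np.random.default_rng(0)

def simulated_neural_frames(word, noise=0.3):
    """Simulate the phoneme probabilities a neural decoder might emit."""
    frames = []
    for ph in VOCAB[word]:
        probs = rng.random(len(PHONEMES)) * noise
        probs[PHONEMES.index(ph)] += 1.0    # the imagined phoneme dominates
        frames.append(probs / probs.sum())  # normalize to a distribution
    return frames

def decode_word(frames):
    """Pick the vocabulary word that best explains the phoneme probabilities."""
    best_word, best_score = None, float("-inf")
    for word, phonemes in VOCAB.items():
        if len(phonemes) != len(frames):
            continue  # a real decoder would align sequences (e.g., with CTC)
        score = sum(np.log(frame[PHONEMES.index(ph)])
                    for frame, ph in zip(frames, phonemes))
        if score > best_score:
            best_word, best_score = word, score
    return best_word

print(decode_word(simulated_neural_frames("hello")))  # -> hello
```

In real systems of this kind, the probabilities come from implanted microelectrode recordings rather than a simulator, and the word search is typically constrained by a language model, which is what makes a vocabulary as large as 125,000 words tractable.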
To ensure private thoughts remained private, the researchers implemented a code phrase: “chitty chitty bang bang.” When participants imagined saying this phrase, it would prompt the BCI to start or stop transcribing.
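The paper’s gating logic isn’t published as code in this article, but the behavior it describes, in which decoded text is surfaced only between imagined utterances of the code phrase, amounts to a simple toggle. The function name and example word stream below are made up for illustration:

```python
# Hypothetical sketch of the code-phrase safeguard described above: decoded
# words are kept only while transcription has been toggled on by the
# imagined unlock phrase. Names and the example stream are invented.

WAKE_PHRASE = ["chitty", "chitty", "bang", "bang"]

def gate_by_phrase(words, phrase=WAKE_PHRASE):
    """Keep only the words decoded while transcription is toggled on."""
    output = []
    transcribing = False
    i, n = 0, len(phrase)
    while i < len(words):
        if words[i:i + n] == phrase:         # imagined code phrase detected
            transcribing = not transcribing  # toggle transcription on/off
            i += n                           # the phrase itself is never shown
        else:
            if transcribing:
                output.append(words[i])
            i += 1
    return " ".join(output)

stream = ["chitty", "chitty", "bang", "bang",  # unlock
          "i", "need", "water",                # transcribed
          "chitty", "chitty", "bang", "bang",  # lock
          "private", "thought"]                # ignored
print(gate_by_phrase(stream))  # -> i need water
```

A distinctive, otherwise-unlikely phrase makes accidental toggling improbable. The harder part, glossed over here, is that the real system must recognize the phrase from neural activity alone rather than from clean text as in this sketch.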
For Kunz, this research is particularly close to home. “My father actually had ALS and lost the ability to speak,” she says, adding that this is why she got into her field of research. “I kind of became his own personal speech translator toward the end of his life since I was kind of the only one that could understand him. That’s why I personally know the importance and the impact this sort of research can have.”
Emma R. Hasson is a Ph.D. candidate in mathematics at the City University of New York Graduate Center with expertise in math education and communication. Hasson is also a 2025 AAAS Mass Media Fellow at Scientific American.