A New Brain Implant Can Decode The ‘Inner Monologue’ Taking Place In A Person’s Mind

Scientists have developed a new brain-computer interface that can capture and decode the “inner monologue” taking place in a person’s mind. The researchers hope the advance will make communication easier for people who are unable to speak due to severe paralysis or other neurological conditions.

The scientists achieved this breakthrough by pinpointing the brain activity associated with inner speech, then using the brain-computer interface to interpret that activity and type out what the person intends to say, even when no intelligible speech is produced, according to a statement announcing their achievement.

“This is the first time we’ve managed to understand what brain activity looks like when you just think about speaking,” said the study’s co-author Erin Kunz, an electrical engineer at Stanford University. “For people with severe speech and motor impairments, [brain-computer interfaces] capable of decoding inner speech could help them communicate much more easily and more naturally.”

Decoding a person’s inner speech

As the researchers explained, other systems that track users’ eye movements to type out words can be tiring and slow for people with limited muscle control. By figuring out how to decode a person’s inner speech, they hope to make the process much simpler.

“If you just have to think about speech instead of actually trying to speak, it’s potentially easier and faster for people,” said study co-author Benyamin Meschede-Krasa, a neuroscience Ph.D. student at Stanford University.

The team recorded neural activity from microelectrodes implanted in the motor cortex — a brain region responsible for speaking — of four participants with severe paralysis from either amyotrophic lateral sclerosis (ALS) or a brainstem stroke. The researchers asked the participants to either attempt to speak or imagine saying a set of words. They found that attempted speech and inner speech activate overlapping regions in the brain and evoke similar patterns of neural activity, but inner speech tends to show a weaker magnitude of activation overall.

Using the inner speech data, the team trained artificial intelligence models to interpret imagined words. In a proof-of-concept demonstration, the BCI could decode imagined sentences from a vocabulary of up to 125,000 words with an accuracy rate as high as 74%.
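To make the idea concrete, here is a deliberately simplified toy sketch, not the study's actual AI model, of how a decoder might match a noisy "inner speech" signal against per-word activity patterns. The vocabulary, feature count, and noise levels are all invented for illustration; the real system works with a vocabulary of up to 125,000 words.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["hello", "water", "help", "yes", "no"]  # real system: up to 125,000 words
FEATURES = 16  # hypothetical number of neural features per time window

# Pretend each word evokes its own characteristic pattern of neural activity.
templates = {word: rng.normal(size=FEATURES) for word in VOCAB}

def record_inner_speech(word, scale=0.5):
    """Simulate inner speech: the word's pattern at weaker magnitude, plus noise
    (mirroring the finding that inner speech resembles attempted speech but weaker)."""
    return scale * templates[word] + rng.normal(scale=0.3, size=FEATURES)

def decode(activity):
    """Return the vocabulary word whose template best matches the recorded activity."""
    return max(VOCAB, key=lambda w: np.dot(activity, templates[w]))

# Score the toy decoder on 100 simulated trials.
trials = VOCAB * 20
hits = sum(decode(record_inner_speech(w)) == w for w in trials)
print(f"toy decoder accuracy: {hits / len(trials):.0%}")
```

The real decoder is a trained neural network operating on spiking activity, and scaling from five words to 125,000 is exactly what makes the 74% figure impressive; this sketch only shows the matching principle.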

The researchers also demonstrated a password-controlled mechanism that lets the user stop the brain-computer interface from decoding their inner speech until it is unlocked. The system recognized the password with more than 98% accuracy.

“The future of BCIs is bright,” said study co-author Frank Willett, assistant professor of neurosurgery at Stanford University. “This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech.”

Douglas Charles is a Senior Editor for BroBible with two decades of expertise writing about sports, science, and pop culture, with a particular focus on the weird news and events that capture the internet's attention. He is a graduate of the University of Iowa.