
Brain-computer interface helps patient with locked-in syndrome communicate

An illustration of a human brain and digital circuit.
(Image credit: Yuichiro Chino/Getty Images)

For the first time, a patient in a completely locked-in state due to amyotrophic lateral sclerosis (ALS) was able to communicate in words and sentences using a brain-computer interface, according to a new study.

This technology allowed the patient, a 37-year-old man with ALS, to communicate by forming words and phrases, despite his lack of any voluntary muscle control. The system involved implanting a device with microelectrodes into the patient's brain and using custom computer software to translate his brain signals.

ALS — also known as motor neuron disease or Lou Gehrig's disease — is a rare neurodegenerative disorder that affects the neurons responsible for the control of voluntary muscle movements. According to the National Institute of Neurological Disorders and Stroke (NINDS), this disease causes the degeneration and eventual death of these nerve cells, affecting a person's ability to walk, talk, chew and swallow.

As the disease progresses, affected individuals eventually lose the ability to breathe without assistance from a ventilator or other device, and nearly all of their muscles become paralyzed. When people develop paralysis of all their muscles except those that control eye movements, this is known as a "locked-in state." To communicate, people in a locked-in state need to use assistive and augmentative communication devices.


Many of these devices are controlled by eye movements or any facial muscles that are still functional. (For example, Stephen Hawking used a device that allowed him to communicate by moving his cheek muscle, according to Wired.) But once a person with ALS loses the ability to move these muscles as well, they enter a "completely locked-in state" that prevents them from communicating with their family, caregivers and the rest of the outside world.

The patient in the new study (known as patient K1) had lost the ability to walk and talk by the end of 2015, according to the study, published Tuesday (March 22) in the journal Nature Communications. He started using an eye-tracking based communication device the following year, but eventually could no longer fixate his gaze well enough to use it and was limited to "yes" or "no" communication. Anticipating that he was likely to lose all remaining eye control in the near future and move into a completely locked-in state, he asked his family to help him find an alternative way to communicate with them.

Patient K1's family reached out to two of the study's authors: Dr. Niels Birbaumer of the Institute of Medical Psychology and Behavioral Neurobiology at the University of Tübingen in Germany, and Dr. Ujwal Chaudhary of the non-profit organization ALS Voice in Mössingen, Germany. The two helped set patient K1 up with a non-invasive brain-computer interface system that let him communicate using his remaining eye movement. When he eventually lost the ability to move his eyes as well, their team implanted the microelectrode device into his brain as part of the brain-computer interface.

The system works through "auditory neurofeedback," meaning the patient had to "match" the frequency of his brain waves to a certain tone, word or phrase. Matching and holding the frequency at a target level for 500 milliseconds registered a positive or negative response from the system.
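The hold-to-select mechanism described above can be sketched in a few lines. This is a hypothetical illustration only, not the study's software: it assumes a normalized feedback signal sampled at a fixed interval, and all thresholds and names here are invented for the example.

```python
# Hypothetical sketch of hold-to-select neurofeedback: the patient steers a
# normalized signal into a "high" band for "yes" or a "low" band for "no",
# and must hold it there for 500 ms before a response is registered.
# Thresholds, sampling rate and function names are illustrative assumptions.

HOLD_MS = 500     # how long the signal must stay in band (from the article)
SAMPLE_MS = 50    # assumed sampling interval of the feedback loop

def classify_response(samples, high=0.7, low=0.3,
                      sample_ms=SAMPLE_MS, hold_ms=HOLD_MS):
    """Return 'yes', 'no', or None from a stream of normalized signal values."""
    needed = hold_ms // sample_ms      # consecutive in-band samples required
    run_label, run_len = None, 0
    for s in samples:
        if s >= high:
            label = "yes"
        elif s <= low:
            label = "no"
        else:
            label = None               # in the neutral zone: run resets
        if label is not None and label == run_label:
            run_len += 1
        else:
            run_label, run_len = label, (1 if label else 0)
        if run_label and run_len >= needed:
            return run_label           # held long enough: selection made
    return None                        # never held long enough: no selection
```

For instance, ten consecutive samples above the high threshold (500 ms at 50 ms per sample) would register a "yes," while a signal that dips back into the neutral zone resets the count and produces no selection.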

As communication with patients in a completely locked-in state has historically not been possible, the team didn't know whether the system would work for patient K1. In fact, "nobody believed that communication is possible in a completely locked-in state," Birbaumer told Live Science.

Yet about three months after the surgery, patient K1 was able to successfully use neurofeedback to control the brain-computer interface. About half a month later, he started selecting letters and spelling out words and phrases, eventually even thanking the authors and spelling out, "boys, it works so effortlessly."

According to study coauthor Dr. Jonas Zimmermann of the Wyss Center for Bio and Neuroengineering in Geneva, Switzerland, this showed that patient K1 "was able to use motor areas of the brain to communicate, even though he was not actually able to move at all." Most importantly, Chaudhary said, the system allowed patient K1 to "give specific instructions on how he should be cared for," giving him a voice in his own needs, desires and well-being.

While patient K1 was able to use the neurofeedback-based brain-computer interface to communicate with his family, the system isn't perfect: it still requires constant supervision to catch technical errors.

Without supervision by the study team, Zimmermann said, "the system could get stuck in a loop (rejecting all options, or always selecting the first letter, or just selecting random letters)." The team is currently working on ways to deal with this problem, such as enabling the system to detect these malfunctions and switch off automatically when they occur.
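The kind of automatic shutoff the team describes could, in principle, watch the stream of recent selections for the stuck-loop patterns Zimmermann lists. The sketch below is purely illustrative and not from the study; the class name, window size and limits are all assumptions for the example.

```python
# Illustrative (hypothetical) malfunction monitor for a BCI speller: flag the
# system for automatic shutoff when recent selections look stuck, e.g. nearly
# all options rejected, or the same letter chosen again and again.
from collections import deque

class MalfunctionMonitor:
    """Track recent selections and decide whether the speller should shut off."""

    def __init__(self, window=10, max_identical=6, max_rejections=8):
        self.recent = deque(maxlen=window)   # sliding window of selections
        self.max_identical = max_identical   # identical letters in a row allowed
        self.max_rejections = max_rejections # rejections tolerated per window

    def record(self, selection):
        """Log one selection (a letter, or None for a rejection); return shutoff flag."""
        self.recent.append(selection)
        return self.should_shut_off()

    def should_shut_off(self):
        # Pattern 1: the system is rejecting (almost) all options.
        rejections = sum(1 for s in self.recent if s is None)
        if rejections >= self.max_rejections:
            return True
        # Pattern 2: the same letter is being selected over and over.
        letters = [s for s in self.recent if s is not None]
        if (len(letters) >= self.max_identical
                and len(set(letters[-self.max_identical:])) == 1):
            return True
        return False
```

A supervisor process could call `record()` after each selection and pause the speller once it returns `True`; detecting the "random letters" failure mode would need a different signal, such as statistical tests against the patient's usual letter frequencies.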

The authors also noted that the patient in this case trained with a neurofeedback system before he completely lost muscle function, so it's unclear how well the brain-computer interface would work if training began when a patient was already in a completely locked-in state.

At the Wyss Center, researchers are also working on a new, fully implantable system called ABILITY, which doesn't need an external computer to work, Zimmermann said. The system, currently undergoing preclinical verification, should improve usability and make setup and operation easier, he said.

The researchers hope this technology can one day provide a much better experience for patients in a locked-in state, and allow these patients to have a say in decisions involving their care. "However, much more work on the technology needs to be done before it will be widely available," Zimmermann said.

Originally published on Live Science.

Siddhi Camila Lama is an independent science, health and gastronomy writer who is also the managing editor of HairScience.org. She's written for Orb Media, Atlas Obscura, BrainFacts, Medium's science and tech publication OneZero, and more. Siddhi is a certified nutritionist with a bachelor's in Human Development, a master's in Organ, Tissue, and Cellular Transplantation, and a Ph.D. in Bioengineering.