Imagine being so completely paralyzed that you cannot control your muscles and can communicate only with your eyes.
This is the dark reality for individuals with degenerative disorders like ALS. But a new brain implant, built by the medical technology company Medtronic, may finally allow these patients to better communicate.
The implant, which is described in a study published this week in The New England Journal of Medicine, is a brain-computer interface that allows a person to control a computer or other electronic device using only his or her brain waves, with no movement required. According to Dr. Nick Ramsey, a professor of cognitive neuroscience at University Medical Center Utrecht in the Netherlands who led the study, such systems have been in laboratory development for Parkinson’s patients for decades, and he and his team wanted to apply the same concept to ALS treatment.
“We thought we should be able to bring this into the homes of people who really need it,” he told the Observer.
A 58-year-old Dutch woman with ALS, who until recently had to use blinks to denote “yes” or “no,” volunteered to test the implant. The doctors placed four electrodes in her motor cortex (the region of the brain that controls voluntary movement) and transmitted brain signals wirelessly through a coil taped to her shirt.
The entire system ran on a Surface Pro tablet and was controlled by the patient's "brain clicks": whenever she was presented with a target, she attempted to move her hand, and the resulting nerve activity produced a detectable rise in the electrical signal from her motor cortex. That signal, picked up by the implanted electrodes, drove the onscreen cursor to the desired spot on the screen.
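At its core, a "brain click" of this kind is a threshold decision: when activity from the motor cortex stays above a calibrated level for long enough, the system registers a click. Here is a minimal sketch of that idea; the signal values, threshold, and hold duration are illustrative assumptions, not parameters from the study:

```python
# Hypothetical sketch of threshold-based "brain click" detection.
# The numbers below are invented for illustration; they are not
# the values used in the Utrecht study.

def detect_clicks(samples, threshold=0.6, hold=3):
    """Return the indices where the signal stayed above `threshold`
    for at least `hold` consecutive samples (one 'click' per run)."""
    clicks = []
    run = 0
    for i, value in enumerate(samples):
        if value > threshold:
            run += 1
            if run == hold:   # sustained activity registers a click
                clicks.append(i)
        else:
            run = 0           # activity dropped; reset the counter
    return clicks

# Example: an attempted hand movement raises the signal for a stretch.
signal = [0.1, 0.2, 0.7, 0.8, 0.9, 0.3, 0.1, 0.7, 0.7, 0.8, 0.8, 0.2]
print(detect_clicks(signal))  # -> [4, 9]
```

Requiring sustained activity rather than a single high sample is one simple way to keep a noisy signal from producing spurious clicks, which matches the researchers' reported struggle with noisier-than-expected signals.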
Ramsey designed several computer games to help the patient learn how to control the device. In the first challenge, which Ramsey likened to Pong, she attempted to hit a target on the video screen by moving the cursor up and down. Next, she learned how to time her brain signals by moving an image of a ball up and down when prompted by the computer. She also used a game similar to whack-a-mole to figure out how to select specific items on the screen.
The final objective, however, was to get the patient to spell out words and communicate with her caregivers. She gradually learned how to highlight letters using brain clicks—at first the words were dictated to her, but eventually she began communicating her own messages at a rate of three to five letters a minute.
One challenge the doctors ran into was that at the beginning of the study the patient expended too much energy on the tasks.
“The brain signal was noisier than we expected,” Ramsey said, referring to the patient’s increased neurological activity. “We found at every step that we needed to develop new pieces of software to read the signals properly and figure out her intentions.”
But by the six-month mark, the patient was completing the tasks with little difficulty, and her spelling was accurate 89 percent of the time.
Ramsey said that the patient has continued using the tablet system since the study concluded in July and that she wants to help with further research.
Now that the software issues have been worked out, future patients could learn to use the full system within two months, according to Ramsey.
One thing that needs improvement in future iterations is the device's navigation speed: in the current version, the patient must highlight an entire row of letters and then navigate within it to the specific letter she wants, hence the limit of three to five letters per minute.
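That row-then-letter navigation is a classic scanning-speller design: every letter costs time proportional to how many rows and letters must be stepped through before it can be selected. A rough sketch of why that bounds the spelling rate; the grid layout and per-step timing here are assumptions for the example, not details from the study:

```python
# Hypothetical row-column scanning speller, illustrating why
# selection speed is limited. The grid and step time are
# invented for this example.

GRID = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ_.,?"]

def scan_steps(target, grid=GRID):
    """Number of highlight steps to reach `target`: the system
    steps through rows until the right one is selected, then
    through that row's letters."""
    for r, row in enumerate(grid):
        if target in row:
            return (r + 1) + (row.index(target) + 1)
    raise ValueError(f"{target!r} not in grid")

def seconds_to_spell(word, step_time=1.5):
    """Total scan time if each highlight step lasts `step_time` seconds."""
    return sum(scan_steps(ch) for ch in word) * step_time

# Even a short word takes the better part of a minute at this pace,
# in the same ballpark as a rate of a few letters per minute.
print(seconds_to_spell("HELLO"))  # -> 48.0
```

A design that let the patient click directly on a letter, rather than scanning row by row, would cut the per-letter step count sharply, which is the kind of speedup Ramsey's proposed next-generation implant aims at.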
Ramsey wants to develop a more advanced implant that would let patients click directly on letters, but he said this would not be available for four to five years (though a commercial version of the current device should be available before that).
While the current system is slow, it's still an exciting development, according to Dr. Dale Lange, chair of the department of neurology at Manhattan's Hospital for Special Surgery.
“Anytime you can take thoughts and transform them into movements is groundbreaking,” Lange told the Observer. “It’s an amazing start. We desperately need new communication devices for people who have lost their ability to move.”
Lange is also hopeful about future research, though he admitted that scientific progress is slow.
“Does it give people hope tomorrow? No,” he said. “But we can build on these findings and have meaningful communication. Think about the phenomenon of being able to move fingers that don’t move. It’s mind blowing.”
Other ALS advocates cheered Ramsey’s discovery as well. Clare Durrett is the associate executive director of Team Gleason, a charity which works to empower people with ALS and work toward a cure. It was founded by Steve Gleason, a former safety with the New Orleans Saints who was diagnosed with ALS in 2011 and also uses a Surface Pro to communicate—he types about 20-30 words per minute with his eyes.
“Steve has always said, ‘until there is a medical cure for ALS, technology can be that cure,'” Durrett said in an email. “This research is encouraging and is another step toward breaking down the physical barriers created by a disease like ALS.”