In a Stanford-led research report, three participants with movement impairment controlled an onscreen cursor simply by imagining their own hand movements.
A clinical research publication led by Stanford University investigators has demonstrated that a brain-to-computer hookup can enable people with paralysis to type via direct brain control at the highest speeds and accuracy levels reported to date.
The report involved three study participants with severe limb weakness — two from amyotrophic lateral sclerosis, also called Lou Gehrig’s disease, and one from a spinal cord injury. They each had one or two baby-aspirin-sized electrode arrays placed in their brains to record signals from the motor cortex, a region controlling muscle movement. These signals were transmitted to a computer via a cable and translated by algorithms into point-and-click commands guiding a cursor to characters on an onscreen keyboard.
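The pipeline described above can be sketched in miniature. The following is an illustrative linear velocity decoder only, not the study's actual algorithm (the BrainGate work uses more sophisticated decoders, fit during calibration); the weights, firing rates, and bin sizes here are invented for illustration.

```python
import numpy as np

# Toy linear decoder: map binned spike counts from a 100-electrode
# array to a 2-D cursor velocity. The weight matrix would normally
# be fit during a calibration block; here it is random.
rng = np.random.default_rng(0)
n_electrodes = 100                              # electrodes on the array
W = rng.normal(size=(2, n_electrodes)) * 0.01   # (vx, vy) decoding weights
baseline = rng.poisson(5, size=n_electrodes)    # assumed resting spike counts

def decode_velocity(spike_counts, dt=0.02):
    """Translate one 20 ms bin of spike counts into a cursor velocity."""
    rates = (spike_counts - baseline) / dt      # deviation from baseline, Hz
    return W @ rates                            # (dx/dt, dy/dt)

cursor = np.zeros(2)
for _ in range(50):                             # one second of 20 ms bins
    spikes = rng.poisson(5, size=n_electrodes)  # simulated spike counts
    cursor += decode_velocity(spikes) * 0.02    # integrate velocity into position
```

In the real system, a separate classifier also detects the "click" intention; this sketch covers only the continuous cursor-movement half of the point-and-click scheme.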
Each participant, after minimal training, mastered the technique well enough to outperform any previously reported test of brain-computer interfaces, or BCIs, for enhancing communication by people with similarly impaired movement. Notably, the participants achieved these typing rates without the automatic word-completion assistance common in today's electronic keyboarding applications, which likely would have boosted their performance.
One participant, Dennis Degray of Menlo Park, California, was able to type 39 correct characters per minute, equivalent to about eight words per minute.
‘A major milestone’
This point-and-click approach could be applied to a variety of computing devices, including smartphones and tablets, without substantial modifications, the Stanford researchers said.
“Our study’s success marks a major milestone on the road to improving quality of life for people with paralysis,” said Jaimie Henderson, MD, professor of neurosurgery, who performed two of the three device-implantation procedures at Stanford Hospital. The third took place at Massachusetts General Hospital.
Henderson and Krishna Shenoy, PhD, professor of electrical engineering, are co-senior authors of the study, which was published online Feb. 21 in eLife. The lead authors are former postdoctoral scholar Chethan Pandarinath, PhD, and postdoctoral scholar Paul Nuyujukian, MD, PhD, both of whom spent well over two years working full time on the project at Stanford.
“This study reports the highest speed and accuracy, by a factor of three, over what’s been shown before,” said Shenoy, a Howard Hughes Medical Institute investigator who’s been pursuing BCI development for 15 years and working with Henderson since 2009. “We’re approaching the speed at which you can type text on your cellphone.”
“The performance is really exciting,” said Pandarinath, who now has a joint appointment at Emory University and the Georgia Institute of Technology as an assistant professor of biomedical engineering. “We’re achieving communication rates that many people with arm and hand paralysis would find useful. That’s a critical step for making devices that could be suitable for real-world use.”
Shenoy’s lab pioneered the algorithms used to decode the complex volleys of electrical signals fired by nerve cells in the motor cortex, the brain’s command center for movement, and convert them in real time into actions ordinarily executed by spinal cord and muscles.
“These high-performing BCI algorithms’ use in human clinical trials demonstrates the potential for this class of technology to restore communication to people with paralysis,” said Nuyujukian.
Millions of people with paralysis reside in the United States. Sometimes their paralysis comes gradually, as occurs in ALS. Sometimes it arrives suddenly, as in Degray’s case.
Now 64, Degray became quadriplegic on Oct. 10, 2007, when he fell and sustained a life-changing spinal-cord injury. “I was taking out the trash in the rain,” he said. Holding the garbage in one hand and the recycling in the other, he slipped on the grass and landed on his chin. The impact spared his brain but severely injured his spine, cutting off all communication between his brain and musculature from the head down.
“I’ve got nothing going on below the collarbones,” he said.
Degray received two device implants at Henderson’s hands in August 2016. In several ensuing research sessions, he and the other two study participants, who underwent similar surgeries, were encouraged to attempt or visualize patterns of desired arm, hand and finger movements. Resulting neural signals from the motor cortex were electronically extracted by the embedded recording devices, transmitted to a computer and translated by Shenoy’s algorithms into commands directing a cursor on an onscreen keyboard to participant-specified characters.
The researchers gauged the speeds at which the patients were able to correctly copy phrases and sentences — for example, “The quick brown fox jumped over the lazy dog.” Average rates were 7.8 words per minute for Degray and 6.3 and 2.7 words per minute, respectively, for the other two participants.
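These rates follow the standard typing convention of counting five characters, including spaces, as one "word," which is how 39 correct characters per minute works out to roughly eight words per minute:

```python
def chars_to_wpm(chars_per_minute, chars_per_word=5):
    """Convert a character rate to words per minute using the
    standard five-characters-per-word typing convention."""
    return chars_per_minute / chars_per_word

# Degray's rate: 39 correct characters per minute
print(chars_to_wpm(39))  # 7.8
```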
A tiny silicon chip
The investigational system used in the study, an intracortical brain-computer interface called the BrainGate Neural Interface System, represents the newest generation of BCIs. Previous generations picked up signals first via electrical leads placed on the scalp, then by being surgically positioned at the brain’s surface beneath the skull.
An intracortical BCI uses a tiny silicon chip, just over one-sixth of an inch square, from which protrude 100 electrodes that penetrate the brain to about the thickness of a quarter and tap into the electrical activity of individual nerve cells in the motor cortex.
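Some back-of-the-envelope geometry puts those dimensions in perspective. Assuming the 100 electrodes sit on a square 10 x 10 grid (as on the Utah-style arrays used in BrainGate research), a chip one-sixth of an inch on a side implies an electrode spacing of a bit over 0.4 mm; the figures below are rough unit conversions, not specifications from the study.

```python
INCH_MM = 25.4                    # millimeters per inch

side_mm = INCH_MM / 6             # chip side: about 4.2 mm
grid = 10                         # assumed 10 x 10 electrode layout
pitch_mm = side_mm / grid         # approximate electrode spacing
quarter_mm = 1.75                 # US quarter thickness, the stated
                                  # approximate penetration depth

print(f"chip side: {side_mm:.2f} mm")
print(f"electrode pitch: {pitch_mm:.2f} mm")
print(f"penetration depth: {quarter_mm:.2f} mm")
```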
Henderson likened the resulting improved resolution of neural sensing, compared with that of older-generation BCIs, to that of handing out applause meters to individual members of a studio audience rather than just stationing them on the ceiling, “so you can tell just how hard and how fast each person in the audience is clapping.”
Shenoy said the day will come, closer to five years from now than 10, he predicted, when a self-calibrating, fully implanted wireless system can be used around the clock without caregiver assistance and with no cosmetic impact.
“I don’t see any insurmountable challenges,” he said. “We know the steps we have to take to get there.”
Degray, who continues to participate actively in the research, knew how to type before his accident but was no expert at it. He described his newly revealed prowess in the language of a video game aficionado.
“This is like one of the coolest video games I’ve ever gotten to play with,” he said. “And I don’t even have to put a quarter in it.”
The study’s results are the culmination of a long-running collaboration between Henderson and Shenoy and a multi-institutional consortium called BrainGate. Leigh Hochberg, MD, PhD, a neurologist and neuroscientist at Massachusetts General Hospital, Brown University and the VA Rehabilitation Research and Development Center for Neurorestoration and Neurotechnology in Providence, Rhode Island, directs the pilot clinical trial of the BrainGate system and is a study co-author.
“This incredible collaboration continues to break new ground in developing powerful, intuitive, flexible neural interfaces that we all hope will one day restore communication, mobility and independence for people with neurologic disease or injury,” said Hochberg.