Researchers at Tokyo Institute of Technology and collaborating institutions have developed a new technique to decode human motor intention from electroencephalography (EEG).
The technique is motivated by the brain's well-documented ability to predict the sensory outcomes of self-generated and imagined actions using so-called forward models. For the first time, the method achieved nearly 90% single-trial decoding accuracy across tested subjects, within 96 ms of stimulation, with zero user training and no additional cognitive load on the users.
The ultimate dream of brain-computer interface (BCI) research is to develop an efficient connection between machines and the human brain, such that the machines can be used at will; for example, enabling an amputee to control a robotic arm just by thinking of it, as if it were their own arm. A major challenge for such a task is deciphering a user's movement intention from their brain activity while minimizing the effort required of the user. While a plethora of methods have been proposed for this over the last two decades (1-2), they all demand considerable effort on the part of the human user: they either require extensive user training, work well for only a subset of users, or rely on a conspicuous stimulus that imposes additional attentional and cognitive load. In this study, researchers from Tokyo Institute of Technology (Tokyo Tech), the Centre national de la recherche scientifique (CNRS, France), AIST, and Osaka University propose a new movement-intention decoding philosophy and technique that overcomes all of these issues while providing substantially better decoding performance.
The fundamental difference between previous methods and the new proposal lies in what is decoded. All previous methods decode which movement a user intends or imagines, either directly (as in so-called active BCI systems) or indirectly, by decoding what the user is attending to (as in reactive BCI systems). Here, the researchers propose to pair a subliminal sensory stimulator with electroencephalography (EEG) and decode, not which movement a user intends or imagines, but whether the intended movement matches the sensory feedback delivered by the stimulator. Their proposal is motivated by the multitude of studies on so-called forward models in the brain: the neural circuitry implicated in predicting the sensory outcomes of self-generated movements (3). The sensory prediction errors between the forward-model predictions and the actual sensory signals are known to be fundamental to our sensorimotor abilities, including haptic perception (4), motor control (5), motor learning (6), and even interpersonal interactions (7-8) and the cognition of self (9). The researchers therefore hypothesized that prediction errors would have a large signature in EEG, and that perturbing the prediction errors (using an external sensory stimulator) would be a promising way to decode movement intentions.
This proposal was tested in a binary simulated wheelchair task in which users thought of turning their wheelchair either left or right. The researchers stimulated the user's vestibular system (the dominant sensory feedback during turning) toward either the left or the right, subliminally, using a galvanic vestibular stimulator. They then decoded the presence or absence of prediction errors (i.e., whether the stimulation direction matched the direction the user imagined) and consequently, since the direction of stimulation is known, the direction the user imagined. This procedure provided excellent single-trial decoding accuracy (87.2% median) in all tested subjects, within 96 ms of stimulation. These results were obtained with zero user training and no additional cognitive load on the users, as the stimulation was subliminal.
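The inference step described above can be sketched in a few lines. This is an illustrative sketch, not the authors' code: the EEG classifier that detects a prediction error (match vs. mismatch between the stimulation and the imagined turn) is assumed and stubbed out here, so that only the direction-inference logic is shown.

```python
# Hypothetical sketch of the direction-inference step: given a known
# subliminal GVS stimulation direction and a (stubbed) EEG classification
# of whether a sensory prediction error occurred, infer the direction
# the user imagined turning.

def infer_intention(stim_direction: str, eeg_says_mismatch: bool) -> str:
    """Return the inferred intended turn direction ('left' or 'right').

    No prediction error in EEG -> the stimulation matched the intention.
    A prediction error in EEG -> the intention was the opposite direction.
    """
    opposite = {"left": "right", "right": "left"}
    return opposite[stim_direction] if eeg_says_mismatch else stim_direction

# Example: stimulate toward the left; the (assumed) EEG classifier
# reports a prediction error, so the user must have imagined turning right.
print(infer_intention("left", eeg_says_mismatch=True))    # right
print(infer_intention("right", eeg_says_mismatch=False))  # right
```

In a real system the `eeg_says_mismatch` flag would come from a single-trial classifier trained on the EEG response in the first ~100 ms after stimulation; the point of the sketch is that, because the stimulation direction is chosen by the system, a binary match/mismatch decision is sufficient to recover the imagined direction.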
- Figure 2. Decoding performance summary: The across-subject median decoding performance when decoding for the direction in which a subject wants to turn (i.e., the cue direction), as in previous methods, is shown in red and pink, while decoding using the newly proposed method is shown in black. The data at each time point represent the decoding performance using data from the period between a reference point ('cue' for the red data, 'GVS start' for the pink and black data) and that time point. Box-plot boundaries represent the 25th and 75th percentiles, while the whiskers represent the data range across subjects. The inset histograms show the subject-ensemble decoding performance in the 140 test trials (20 trials x 7 subjects), with each subject's data shown in a different color.
This proposal promises to radically change how movement intention is decoded, for several reasons. Primarily, the method promises better decoding accuracy with no user training and without imposing additional cognitive load on the users. Furthermore, the fact that decoding can be done within 100 ms of stimulation makes it well suited to real-time decoding. Finally, the method is distinct from approaches utilizing ERP, ERD, and ERN signals, meaning it could be used in parallel with current methods to improve their accuracy.