Controlling 2 prosthetic arms simultaneously with your thoughts

Matt Fifer adjusts the electrodes attached to Buz Chmielewski’s head to prepare for a round of testing. Credit: Johns Hopkins APL

Researchers from the Johns Hopkins University’s Applied Physics Laboratory (APL) and School of Medicine (SOM) have, for the first time, demonstrated simultaneous control of two of the world’s most advanced prosthetic limbs through a brain-machine interface. The team is also developing strategies for providing sensory feedback for both hands at the same time using neural stimulation.

“We are trying to enable a person with quadriplegia to use a direct neural interface to simultaneously control two assistive devices and, at the same time, feel touch sensation when the devices make contact with objects in the environment,” explained Dr. Brock Wester, a biomedical engineer and APL’s principal investigator for the study.

“It has significant implications for restoring capabilities to patients with high spinal cord injuries and neuromuscular diseases,” he continued. “For everything we envision people needing or wanting to do to become independent — tie their shoes, catch and throw a ball, squeeze toothpaste onto a toothbrush — they really need two hands working together.”

These breakthroughs are the latest developments in Revolutionizing Prosthetics (RP), a program launched by the Defense Advanced Research Projects Agency in 2006 to rapidly improve upper-extremity prosthetic technologies and provide new means for users to operate them.

The original vision of the RP program was to create a neurally integrated prosthetic upper limb with human-like capabilities; this resulted in the Modular Prosthetic Limb (MPL). “As we integrated new capabilities into the MPL, such as fingertip sensors for force, acceleration, slip and pressure, we started to ask ourselves, ‘what is the best way to feed this information back to our study participants so that they would be able to interact with the environment just as able-bodied people do?’” said Dr. Francesco Tenore, APL’s project manager for this effort.

In addition to developing the MPL, program researchers have been exploring the use of neural signals to enable “real time” control of prosthetic and intelligent systems. The program’s initial neural control studies with participants at the University of Pittsburgh and the California Institute of Technology/Rancho Los Amigos focused on the control of a single limb, which three participants were able to do after months of training. This success highlighted the possibilities of neuroprosthetics and laid the groundwork for future studies.

APL is working with two research groups at the Johns Hopkins Hospital: Dr. Pablo Celnik’s team in Physical Medicine and Rehabilitation and Dr. Nathan Crone’s team in the Department of Neurology.

In January, in a first-of-its-kind surgery, Dr. Stan Anderson’s team at Johns Hopkins implanted intracortical microelectrode array sensors on both sides of a patient’s brain, in the regions that control movement and touch sensation. As part of the surgery, APL researchers and Crone’s team pioneered a method to identify the best locations for placing the electrodes using real-time mapping of brain activity during the surgery.

The research team has completed several assessments of the neural signals acquired from the motor and sensory areas of the brain, and they’ve studied what the patient feels when the hand areas of his brain are stimulated. The results from these experiments highlight the potential for patients to sense more information about the prosthetic limb or the environment with which they are interacting.

With these tests and the successful surgery, the team has already tallied several “firsts” in the field of brain-machine interfaces.

“For the first time, our team has been able to show a person’s ability to ‘feel’ brain stimulation delivered to both sides of the brain at the same time. We showed how stimulation of left and right finger areas in the brain could be successfully controlled by physical touch to the MPL fingers,” explained APL’s Dr. Matthew Fifer, the technical lead on the project. This study benefits from the world’s first human bilateral implant for recording and stimulation, including 96 electrodes that can be used to deliver very focused neural stimulation to the finger areas of the brain.

“Ultimately, because this is the world’s first bilateral implant, we want to be able to execute motions that require both arms and allow the user to perceive interactions with the environment as though they were coming from his own hands,” Tenore said. “Our team will continue training with our participant to develop motor and sensory capabilities, as well as to explore the potential for control of other devices that could be used to expand a user’s personal or professional capabilities.”

“These developments are critical components necessary for future brain-machine interface technologies — relevant to spinal cord injury, stroke, Lou Gehrig’s disease, among others — all aiming to restore human functions,” said Dr. Adam Cohen, Health Technologies program manager in APL’s National Health Mission Area.

Learn more: In a First, Patient Controls Two Prosthetic Arms with His Thoughts


Merging user and robotic control in a smart artificial hand for amputees

via EPFL

EPFL scientists have successfully tested new neuroprosthetic technology that combines robotic control with users’ voluntary control, opening avenues in the new interdisciplinary field of shared control for neuroprosthetic technologies.

EPFL scientists are developing new approaches for improved control of robotic hands – in particular for amputees – that combines individual finger control and automation for improved grasping and manipulation. This interdisciplinary proof-of-concept between neuroengineering and robotics was successfully tested on three amputees and seven healthy subjects. The results are published in today’s issue of Nature Machine Intelligence.

The technology merges two concepts from two different fields. Implementing them together had never been done before for robotic hand control, and the work contributes to the emerging field of shared control in neuroprosthetics.

One concept, from neuroengineering, involves deciphering intended finger movement from muscular activity on the amputee’s stump to drive individual finger control of the prosthetic hand, something that had never been done before. The other, from robotics, allows the robotic hand to help take hold of objects and maintain contact with them for robust grasping.

“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard, who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
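The slip-and-stabilize behavior described above can be sketched as a simple control loop: when fingertip pressure drops sharply between samples, the controller boosts grip force before the user could consciously react. This is a minimal illustration only, not EPFL’s actual controller; the threshold, boost, and sensor readings are all hypothetical values.

```python
def detect_slip(pressure_history, drop_threshold=0.2):
    """Flag a slip when fingertip pressure falls sharply between samples."""
    if len(pressure_history) < 2:
        return False
    # A rapid drop in measured pressure suggests the object is sliding.
    return (pressure_history[-2] - pressure_history[-1]) > drop_threshold

def stabilize(grip_force, slipping, boost=0.1, max_force=1.0):
    """Increase grip force when a slip is detected, capped at a safe maximum."""
    return min(grip_force + boost, max_force) if slipping else grip_force

# Simulated fingertip pressure readings: steady, then a sudden drop (a slip).
readings = [0.8, 0.8, 0.79, 0.5]
grip = 0.4
for i in range(2, len(readings) + 1):
    if detect_slip(readings[:i]):
        grip = stabilize(grip, True)
print(round(grip, 2))  # prints 0.5: grip tightened once, on the sudden drop
```

A real controller would run this at a fixed sampling rate per finger, which is what makes the quoted 400-millisecond reaction window feasible.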

How shared control works

The algorithm first learns how to decode user intention and translates this into finger movement of the prosthetic hand. The amputee must perform a series of hand movements in order to train the algorithm that uses machine learning. Sensors placed on the amputee’s stump detect muscular activity, and the algorithm learns which hand movements correspond to which patterns of muscular activity. Once the user’s intended finger movements are understood, this information can be used to control individual fingers of the prosthetic hand.
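The decode-then-control pipeline described above can be illustrated with a toy decoder. Here a nearest-centroid classifier stands in for the study’s machine-learning model (whose actual details are not given in the article), mapping simulated EMG feature vectors, one value per stump sensor, to intended finger movements. All feature values and labels below are synthetic.

```python
import numpy as np

# Synthetic training data: each row is an EMG feature vector (one value per
# stump sensor) recorded while the user performs a known finger movement.
train_features = np.array([
    [0.9, 0.1, 0.1],  # patterns recorded during "index flex"
    [0.8, 0.2, 0.1],
    [0.1, 0.9, 0.2],  # patterns recorded during "thumb flex"
    [0.2, 0.8, 0.1],
])
train_labels = ["index", "index", "thumb", "thumb"]

# Training: store the mean EMG pattern (centroid) for each movement class.
centroids = {
    label: train_features[[l == label for l in train_labels]].mean(axis=0)
    for label in set(train_labels)
}

def decode_intent(emg_sample):
    """Return the movement whose learned EMG pattern is closest to the sample."""
    return min(centroids, key=lambda lbl: np.linalg.norm(emg_sample - centroids[lbl]))

print(decode_intent(np.array([0.85, 0.15, 0.1])))  # prints "index"
```

The training session the article describes corresponds to collecting `train_features` and `train_labels`; the decoded label would then drive the matching finger of the prosthetic hand.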

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” says Katie Zhuang, first author of the publication.

Next, the scientists engineered the algorithm so that robotic automation kicks in when the user tries to grasp an object. The algorithm tells the prosthetic hand to close its fingers when an object is in contact with sensors on the surface of the prosthetic hand. This automatic grasping is an adaptation from a previous study for robotic arms designed to deduce the shape of objects and grasp them based on tactile information alone, without the help of visual signals.
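The hand-off between user intent and automation might look like the following sketch: the user’s decoded finger commands drive the hand until a contact sensor fires, at which point an automatic grasp routine takes over and closes the fingers. The command format and sensor layout here are invented for illustration and are not the published system.

```python
def shared_control(user_finger_commands, contact_sensors):
    """Blend user intent with automation: on object contact, close the grasp."""
    if any(contact_sensors):
        # Automation takes over: command a full grasp for robust holding.
        return [1.0] * len(user_finger_commands)
    # No contact yet: pass the user's decoded finger commands through unchanged.
    return user_finger_commands

# User is partially closing two fingers; no object contact yet.
print(shared_control([0.3, 0.5], [False, False]))  # prints [0.3, 0.5]
# An object touches a hand-surface sensor: automation closes the grasp.
print(shared_control([0.3, 0.5], [False, True]))   # prints [1.0, 1.0]
```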

Many challenges remain to engineer the algorithm before it can be implemented in a commercially available prosthetic hand for amputees. For now, the algorithm is still being tested on a robot provided by an external party.

“Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” says Silvestro Micera, EPFL’s Bertarelli Foundation Chair in Translational Neuroengineering, and Professor of Bioelectronics at Scuola Superiore Sant’Anna in Italy.

Learn more: A smart artificial hand for amputees merges user and robotic control


Controlling a robotic arm with your mind

Nicholas Hatsopoulos, PhD, professor in the department of Organismal Biology and Anatomy, in his Culver Hall lab Thursday, Feb. 12, 2015, on the University of Chicago campus. (Photo by Jean Lachat)

A new study by neuroscientists at the University of Chicago shows how amputees can learn to control a robotic arm through electrodes implanted in the brain.

The research, published in Nature Communications, details changes that take place in both sides of the brain used to control the amputated limb and the remaining, intact limb. The results show both areas can create new connections to learn how to control the device, even several years after an amputation.

“That’s the novel aspect to this study, seeing that chronic, long-term amputees can learn to control a robotic limb,” said Nicho Hatsopoulos, PhD, professor of organismal biology and anatomy at UChicago and senior author of the study. “But what was also interesting was the brain’s plasticity over long-term exposure, and seeing what happened to the connectivity of the network as they learned to control the device.”

Previous experiments have shown how paralyzed human patients can move robotic limbs through a brain machine interface. The new study is one of the first to test the viability of these devices in amputees as well.

The researchers worked with three rhesus monkeys that had suffered injuries at a young age and required life-saving arm amputations four, nine and 10 years earlier, respectively. Their limbs were not amputated for the purposes of the study. In two of the animals, the researchers implanted electrode arrays in the side of the brain opposite, or contralateral, to the amputated limb. This is the side that used to control the amputated limb. In the third animal, the electrodes were implanted on the same side, or ipsilateral, to the amputated limb. This is the side that still controlled the intact limb.

Monkeys were trained to use their thoughts to move a robotic arm and grasp a ball.

The monkeys were then trained (with generous helpings of juice) to move a robotic arm and grasp a ball using only their thoughts. The scientists recorded the activity of neurons where the electrodes were placed, and used a statistical model to calculate how the neurons were connected to each other before the experiments, during training and once the monkeys mastered the activity.
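One common way to estimate such functional connectivity, offered here as a generic illustration rather than the statistical model the UChicago team actually used, is to correlate the activity of neuron pairs and count strongly correlated pairs as connections; network density is then the fraction of pairs that are connected. All firing-rate data below is simulated.

```python
import numpy as np

def network_density(firing_rates, threshold=0.5):
    """Fraction of neuron pairs whose activity is strongly correlated.

    firing_rates: array of shape (neurons, time_bins).
    """
    corr = np.corrcoef(firing_rates)
    n = corr.shape[0]
    # Count each pair once (upper triangle, excluding the diagonal).
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    connected = sum(abs(corr[i, j]) > threshold for i, j in pairs)
    return connected / len(pairs)

rng = np.random.default_rng(0)
base = rng.normal(size=50)
# Three simulated neurons: two follow a shared drive (hence correlated),
# while the third fires independently.
rates = np.vstack([base + 0.1 * rng.normal(size=50),
                   base + 0.1 * rng.normal(size=50),
                   rng.normal(size=50)])
print(network_density(rates))
```

Under this kind of measure, the "sparse-then-dense" and "pruning-then-rebuilding" trajectories the study reports would show up as the density value changing across training sessions.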

The connections between neurons on the contralateral side—the side that had been controlling the amputated arm—were sparse before the training, most likely because they had not been used for that function in a long time. But as training progressed, these connections became more robust and dense in areas used for both reaching and grasping.

On the ipsilateral side—the side that had been controlling the monkey’s intact arm—the connections were dense at the beginning of the experiments. But the researchers saw something interesting as training progressed: first the connections were pruned and the networks thinned, before rebuilding into a new, dense network.

“That means connections were shedding off as the animal was trying to learn a new task, because there is already a network controlling some other behavior,” said Karthikeyan Balasubramanian, PhD, a postdoctoral researcher who led the study. “But after a few days it started rebuilding into a new network that can control both the intact limb and the neuroprosthetic.”

Comparison of network density between the contralateral and ipsilateral monkeys. When using the brain-machine interface, the contralateral monkey showed a steady increase in network connectivity. In contrast, the ipsilateral monkey showed initial pruning before a steady increase in network density. Each node in the diagram corresponds to a neuron (R: reach neurons; G: grasp neurons).

Now the team plans to continue their work by combining it with research by other groups to equip neuroprosthetic limbs with sensory feedback about touch and proprioception, which is the sense of where the limb is located in space.

“That’s how we can begin to create truly responsive neuroprosthetic limbs, when people can both move it and get natural sensations through the brain machine interface,” Hatsopoulos said.

Learn more: Amputees can learn to control a robotic arm with their minds


Neurotechnology Provides Near-Natural Sense of Touch

Revolutionizing Prosthetics

Revolutionizing Prosthetics program achieves goal of restoring sensation

A 28-year-old who has been paralyzed for more than a decade as a result of a spinal cord injury has become the first person to be able to “feel” physical sensations through a prosthetic hand directly connected to his brain, and even identify which mechanical finger is being gently touched.

The advance, made possible by sophisticated neural technologies developed under DARPA’s Revolutionizing Prosthetics program, points to a future in which people living with paralyzed or missing limbs will not only be able to manipulate objects by sending signals from their brain to robotic devices, but also be able to sense precisely what those devices are touching.

“We’ve completed the circuit,” said DARPA program manager Justin Sanchez. “Prosthetic limbs that can be controlled by thoughts are showing great promise, but without feedback from signals traveling back to the brain it can be difficult to achieve the level of control needed to perform precise movements. By wiring a sense of touch from a mechanical hand directly into the brain, this work shows the potential for seamless bio-technological restoration of near-natural function.”

The clinical work involved the placement of electrode arrays onto the paralyzed volunteer’s sensory cortex—the brain region responsible for identifying tactile sensations such as pressure. In addition, the team placed arrays on the volunteer’s motor cortex, the part of the brain that directs body movements.

Wires were run from the arrays on the motor cortex to a mechanical hand developed by the Applied Physics Laboratory (APL) at Johns Hopkins University. That gave the volunteer—whose identity is being withheld to protect his privacy—the capacity to control the hand’s movements with his thoughts, a feat previously accomplished under the DARPA program by another person with similar injuries.

Then, breaking new neurotechnological ground, the researchers went on to provide the volunteer a sense of touch. The APL hand contains sophisticated torque sensors that can detect when pressure is being applied to any of its fingers, and can convert those physical “sensations” into electrical signals. The team used wires to route those signals to the arrays on the volunteer’s brain.

In the very first set of tests, in which researchers gently touched each of the prosthetic hand’s fingers while the volunteer was blindfolded, he was able to report with nearly 100 percent accuracy which mechanical finger was being touched. The feeling, he reported, was as if his own hand were being touched.

“At one point, instead of pressing one finger, the team decided to press two without telling him,” said Sanchez, who oversees the Revolutionizing Prosthetics program. “He responded in jest, asking whether somebody was trying to play a trick on him. That is when we knew that the feelings he was perceiving through the robotic hand were near-natural.”

Sanchez described the basic findings on Thursday at Wait, What? A Future Technology Forum, hosted by DARPA in St. Louis. Further details about the work are being withheld pending peer review and acceptance for publication in a scientific journal.

The restoration of sensation with implanted neural arrays is one of several neurotechnology-based advances emerging from DARPA’s 18-month-old Biological Technologies Office, Sanchez said. “DARPA’s investments in neurotechnologies are helping to open entirely new worlds of function and experience for individuals living with paralysis and have the potential to benefit people with similarly debilitating brain injuries or diseases,” he said.

Read more: Neurotechnology Provides Near-Natural Sense of Touch


UGA’s Regenerative Bioscience Center collaborates in development of brain-friendly interfaces

Evoked neuronal signals in response to whisker stimulation were recorded using ECM electrode implants in the rat barrel cortex. Initial extended-time neural recording studies suggest that the electrodes maintained their recording capability over a five-week time period and were stable over four weeks after implantation.

Recent research published in the journal Microsystems & Nanoengineering could eventually change the way people living with prosthetics and spinal cord injury lead their lives.

Instead of using neural prosthetic devices—which suffer from immune-system rejection and are believed to fail due to a material and mechanical mismatch—a multi-institutional team, including Lohitash Karumbaiah of the University of Georgia’s Regenerative Bioscience Center, has developed a brain-friendly extracellular matrix environment of neuronal cells that contain very little foreign material. These by-design electrodes are shielded by a covering that the brain recognizes as part of its own composition.

Although once believed to be devoid of immune cells and therefore of immune responses, the brain is now recognized to have its own immune system that protects it against foreign invaders.

“This is not by any means the device that you’re going to implant into a patient,” said Karumbaiah, an assistant professor of animal and dairy science in the UGA College of Agricultural and Environmental Sciences. “This is proof of concept that extracellular matrix can be used to ensheathe a functioning electrode without the use of any other foreign or synthetic materials.”

Implantable neural prosthetic devices in the brain have been around for almost two decades, helping people living with limb loss and spinal cord injury become more independent. However, not only do neural prosthetic devices suffer from immune-system rejection, but most are believed to eventually fail because of a mismatch between the soft brain tissue and the rigid devices.

The collaboration, led by Wen Shen and Mark Allen of the University of Pennsylvania, found that the extracellular matrix derived electrodes adapted to the mechanical properties of brain tissue and were capable of acquiring neural recordings from the brain cortex.

“Neural interface technology is literally mind boggling, considering that one might someday control a prosthetic limb with one’s own thoughts,” Karumbaiah said.

The study’s joint collaborators were Ravi Bellamkonda, who conceived the new approach and is chair of the Wallace H. Coulter Department of Biomedical Engineering at the Georgia Institute of Technology and Emory University, as well as Allen, who at the time was director of the Institute for Electronics and Nanotechnology.

“Hopefully, once we converge upon the nanofabrication techniques that would enable these to be clinically translational, this same methodology could then be applied in getting these extracellular matrix derived electrodes to be the next wave of brain implants,” Karumbaiah said.

Currently, one out of every 190 Americans is living with limb loss, according to the National Institutes of Health. There is a significant burden in cost of care and quality of life for people suffering from this disability.

Read more: UGA’s Regenerative Bioscience Center collaborates in development of brain-friendly interfaces
