One of the world’s first truly bionic legs

Kerry Finn is one of 10 participants who tested the “Utah Bionic Leg,” a self-powered prosthetic limb with a computer processor and motorized joints in the ankle and knee that enable an amputee to walk with more power, vigor and better balance. Photo credit: Mark Helzen Draper/University of Utah College of Engineering.


For a brief time, Kerry Finn felt like “The Terminator” or “The Six Million Dollar Man.”

The 60-year-old retired truck driver from Salt Lake County, Utah, lost his left leg to vascular disease from type 2 diabetes. But last year, he was one of 10 human subjects at the University of Utah to test one of the world’s first truly bionic legs, a self-powered prosthetic limb with a computer processor and motorized joints in the ankle and knee that enable an amputee to walk with more power, vigor and better balance.

“If you’ve ever seen ‘The Terminator,’ that’s what it was like,” Finn says about the experience of testing the bionic leg over the standard prosthetic he normally uses. “It made me feel like I could do things I could not do before. Every time I made a step, it was an awesome feeling.”

University of Utah mechanical engineering assistant professor Tommaso Lenzi, who heads the project developing the “Utah Bionic Leg,” has just received two grants to further advance the technology. One is a $2.2 million award from the National Institutes of Health; the other is a $600,000 grant from the National Science Foundation.

Better, stronger, faster

Like the bionic limbs of fictional astronaut Steve Austin in the hit TV series “The Six Million Dollar Man,” Lenzi’s Utah Bionic Leg can make amputees better, stronger and faster, though not necessarily with Austin’s strength to lift cars or run at 60 miles per hour.

Instead, Lenzi’s real bionic leg has sensors, motors, a computer processor and artificial intelligence that all work in conjunction to give the user more power to walk with less stress on the body than with a standard prosthesis. That means people with amputations, particularly elderly individuals, can walk much longer with the new leg.

“If you walk faster, it will walk faster for you and give you more energy. Or it adapts automatically to the height of the step. Or it can help you cross over obstacles,” Lenzi says.

The leg uses custom-designed force and torque sensors as well as accelerometers and gyroscopes to help determine the leg’s position in space. Those sensors are connected to a computer processor that perceives the environment and determines the user’s rhythmic motions, step length and walking speed. Based on that real-time data, it then provides power to the motors in the joints to assist in walking, standing up, walking up stairs, or maneuvering around obstacles.
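The sense, estimate, actuate cycle described above can be sketched in a few lines of Python. Everything below (names, thresholds, gains) is invented for illustration and is not taken from the Utah Bionic Leg’s actual software.

```python
# Hypothetical sketch of a sensor-driven prosthetic control loop.
# All names, thresholds and gains are invented for illustration;
# they are not from the Utah Bionic Leg's real firmware.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    knee_torque_nm: float   # custom torque sensor at the knee joint
    shank_pitch_deg: float  # orientation from the gyroscope/accelerometer
    heel_force_n: float     # ground-contact force sensor

def estimate_gait_phase(frame: SensorFrame) -> str:
    """Crude gait-phase estimate: stance whenever the heel bears load."""
    return "stance" if frame.heel_force_n > 50.0 else "swing"

def motor_command(frame: SensorFrame, walking_speed_mps: float) -> float:
    """Knee-motor power command (watts), scaled with walking speed.

    Faster walking gets proportionally more assistance, mirroring the
    'if you walk faster, it will walk faster for you' behaviour.
    """
    base = 40.0 if estimate_gait_phase(frame) == "stance" else 10.0
    return base * (1.0 + walking_speed_mps)

frame = SensorFrame(knee_torque_nm=12.0, shank_pitch_deg=-4.0, heel_force_n=320.0)
print(motor_command(frame, walking_speed_mps=1.0))  # stance: 40 * 2.0 = 80.0
```

In the real leg this loop would run many times per second alongside stair and obstacle detection; the sketch only shows the data flow from sensors to motor power.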

“Every time you take a step, it’s powered, and it gives a certain kick. It also gives me the ability to take two steps at a time going up stairs,” Finn says. “With this leg, it’s less strain on my stump. You don’t have to work as hard. And it takes a lot of the stress off the body.”

Half the weight

Just as important, the leg is designed to be about six pounds, half the weight of other bionic legs under development, a huge benefit for a large demographic of amputees — elderly people or those who, like Finn, lost a lower limb to vascular disease.

“The people who need these bionic legs are the ones who are normally the most limited — the elderly,” Lenzi says.

While the prosthetic is made mostly of aluminum and titanium, the low weight owes more to the leg’s design, in which “all of the elements play together,” Lenzi says. “We have a unique way of designing the systems.”

For example, the leg uses a smart transmission system connecting the electric motors to the joints. This optimized system senses what kind of activity the user wants to do and automatically adapts to it, like shifting gears on a bike. The leg also uses smaller batteries, built into the limb itself, to power the motors.
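The “shifting gears on a bike” analogy can be made concrete with a small lookup: the controller detects an activity and picks a matching transmission ratio. The activities and ratio values below are invented for illustration, not the leg’s real figures.

```python
# Hypothetical activity-to-ratio table for a 'smart transmission'.
# Values are invented; a higher ratio trades joint speed for torque.
TRANSMISSION_RATIO = {
    "level_walking": 40.0,   # low ratio: fast, low-torque steps
    "stair_climbing": 80.0,  # higher ratio: slower but stronger
    "sit_to_stand": 100.0,   # maximum torque for standing up
}

def select_ratio(detected_activity: str) -> float:
    # Fall back to level walking when the activity is unrecognised.
    return TRANSMISSION_RATIO.get(detected_activity, 40.0)

print(select_ratio("stair_climbing"))  # 80.0
```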

Lenzi and his team just received the government grants to research how the leg enables a user to move better and do more. The team will also be researching how the prosthetic could be designed to better anticipate a user’s movements by tracking muscle activity in the person’s residual limb.

“The ability to walk is essential to your life and being able to pursue whatever you want to do. When just standing up is a pain and when walking means being afraid of falling, you just don’t go on with your life and you are stuck at home,” Lenzi says. “This is about making bionics accessible for all people and not just those who are young and high performing.”





Controlling 2 prosthetic arms simultaneously with your thoughts

Matt Fifer adjusts the electrodes attached to Buz Chmielewski’s head to prepare for a round of testing. Credit: Johns Hopkins APL

Researchers from the Johns Hopkins University’s Applied Physics Laboratory (APL) and School of Medicine (SOM) have, for the first time, demonstrated simultaneous control of two of the world’s most advanced prosthetic limbs through a brain-machine interface. The team is also developing strategies for providing sensory feedback for both hands at the same time using neural stimulation.

“We are trying to enable a person with quadriplegia to use a direct neural interface to simultaneously control two assistive devices and, at the same time, feel touch sensation when the devices make contact with objects in the environment,” explained Dr. Brock Wester, a biomedical engineer and APL’s principal investigator for the study.

“It has significant implications for restoring capabilities to patients with high spinal cord injuries and neuromuscular diseases,” he continued. “For everything we envision people needing or wanting to do to become independent — tie their shoes, catch and throw a ball, squeeze toothpaste onto a toothbrush — they really need two hands working together.”

These breakthroughs are the latest developments in Revolutionizing Prosthetics (RP), a program launched by the Defense Advanced Research Projects Agency in 2006 to rapidly improve upper-extremity prosthetic technologies and provide new means for users to operate them.

The original vision of the RP program was to create a neurally integrated prosthetic upper limb with human-like capabilities; this resulted in the Modular Prosthetic Limb (MPL). “As we integrated new capabilities into the MPL, such as fingertip sensors for force, acceleration, slip and pressure, we started to ask ourselves, ‘what is the best way to feed this information back to our study participants so that they would be able to interact with the environment just as able-bodied people do?’” said Dr. Francesco Tenore, APL’s project manager for this effort.

In addition to developing the MPL, program researchers have been exploring the use of neural signals to enable “real time” control of prosthetic and intelligent systems. The program’s initial neural control studies with participants at the University of Pittsburgh and the California Institute of Technology/Rancho Los Amigos focused on the control of a single limb, which three participants were able to do after months of training. This success highlighted the possibilities of neuroprosthetics and laid the groundwork for future studies.

APL is working with two research groups at the Johns Hopkins Hospital: Dr. Pablo Celnik’s team in Physical Medicine and Rehabilitation and Dr. Nathan Crone’s team in the Department of Neurology. Read more about this collaboration.

In January, in a first-of-its-kind surgery, Dr. Stan Anderson’s team at Johns Hopkins implanted intracortical microelectrode array sensors on both sides of a patient’s brain, in the regions that control movement and touch sensation. As part of the surgery, APL researchers and Crone’s team pioneered a method to identify the best locations for placing the electrodes using real-time mapping of brain activity during the surgery.

The research team has completed several assessments of the neural signals acquired from the motor and sensory areas of the brain, and they’ve studied what the patient feels when the hand areas of his brain are stimulated. The results from these experiments highlight the potential for patients to sense more information about the prosthetic limb or the environment with which they are interacting.

With these tests and the successful surgery, the team has already tallied several “firsts” in the field of brain-machine interfaces.

“For the first time, our team has been able to show a person’s ability to ‘feel’ brain stimulation delivered to both sides of the brain at the same time. We showed how stimulation of left and right finger areas in the brain could be successfully controlled by physical touch to the MPL fingers,” explained APL’s Dr. Matthew Fifer, the technical lead on the project. This study benefits from the world’s first human bilateral implant for recording and stimulation, including 96 electrodes that can be used to deliver very focused neural stimulation to the finger areas of the brain.

“Ultimately, because this is the world’s first bilateral implant, we want to be able to execute motions that require both arms and allow the user to perceive interactions with the environment as though they were coming from his own hands,” Tenore said. “Our team will continue training with our participant to develop motor and sensory capabilities, as well as to explore the potential for control of other devices that could be used to expand a user’s personal or professional capabilities.”

“These developments are critical components necessary for future brain-machine interface technologies — relevant to spinal cord injury, stroke, Lou Gehrig’s disease, among others — all aiming to restore human functions,” said Dr. Adam Cohen, Health Technologies program manager in APL’s National Health Mission Area.

Learn more: In a First, Patient Controls Two Prosthetic Arms with His Thoughts




Wearable electronics gain a boost from solar-powered supercapacitors

via University of Glasgow

A breakthrough in energy storage technology could bring a new generation of flexible electronic devices to life, including solar-powered prosthetics for amputees.

In a new paper published in the journal Advanced Science, a team of engineers from the University of Glasgow discuss how they have used layers of graphene and polyurethane to create a flexible supercapacitor which can generate power from the sun and store excess energy for later use.

They demonstrate the effectiveness of their new material by powering a series of devices, including a string of 84 power-hungry LEDs and the high-torque motors in a prosthetic hand, allowing it to grasp a series of objects.

The research towards energy autonomous e-skin and wearables is the latest development from the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) research group, led by Professor Ravinder Dahiya.

The top touch sensitive layer developed by the BEST group researchers is made from graphene, a highly flexible, transparent ‘super-material’ form of carbon layers just one atom thick.

Sunlight which passes through the top layer of graphene is used to generate power via a layer of flexible photovoltaic cells below. Any surplus power is stored in a newly-developed supercapacitor, made from a graphite-polyurethane composite.

The team worked to develop a ratio of graphite to polyurethane which provides a relatively large, electroactive surface area where power-generating chemical reactions can take place, creating an energy-dense flexible supercapacitor which can be charged and discharged very quickly.

Similar supercapacitors developed previously have delivered voltages of one volt or less, making single supercapacitors largely unsuited for powering many electronic devices. The team’s new supercapacitor can deliver 2.5 volts, making it more suited for many common applications.
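The jump from 1 volt to 2.5 volts matters more than it might appear, because the energy a capacitor stores grows with the square of its voltage (E = ½CV²). A quick sketch, using an arbitrary capacitance since the paper’s actual figures are not quoted here:

```python
# Stored energy of a capacitor: E = 1/2 * C * V^2.
# The capacitance is arbitrary; only the voltage ratio matters here.
def stored_energy_j(capacitance_f: float, voltage_v: float) -> float:
    return 0.5 * capacitance_f * voltage_v ** 2

C = 1.0  # farads, arbitrary illustrative value
gain = stored_energy_j(C, 2.5) / stored_energy_j(C, 1.0)
print(gain)  # 6.25: 2.5x the voltage stores 6.25x the energy
```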

In laboratory tests, the supercapacitor has been charged, discharged and charged again 15,000 times with no significant loss in its ability to store the power it generates.

Professor Ravinder Dahiya, Professor of Electronics and Nanoengineering at the University of Glasgow’s School of Engineering, who led the research, said: “This is the latest development in a string of successes we’ve had in creating flexible, graphene-based devices which are capable of powering themselves from sunlight.

“Our previous generation of flexible e-skin needed around 20 nanowatts per square centimetre for its operation, which is so low that we were getting surplus energy even with the lowest-quality photovoltaic cells on the market.

“We were keen to see what we could do to capture that extra energy and store it for use at a later time, but we weren’t satisfied with current types of energy storage devices such as batteries to do the job, as they are often heavy, non-flexible, prone to getting hot, and slow to charge.

“Our new flexible supercapacitor, which is made from inexpensive materials, takes us some distance towards our ultimate goal of creating entirely self-sufficient flexible, solar-powered devices which can store the power they generate.

“There’s huge potential for devices such as prosthetics, wearable health monitors, and electric vehicles which incorporate this technology, and we’re keen to continue refining and improving the breakthroughs we’ve made already in this field.”






A new prosthetic limb allows the wearer to reach for objects automatically, without thinking – just like a real hand

Hand that sees

Hand that sees offers new hope to amputees.

Led by biomedical engineers at Newcastle University and funded by the Engineering and Physical Sciences Research Council (EPSRC), the bionic hand is fitted with a camera which instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand.

Bypassing the usual processes which require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb, the hand ‘sees’ and reacts in one fluid movement.

A small number of amputees have already trialled the new technology and now the Newcastle University team are working with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the ‘hands with eyes’ to patients at Newcastle’s Freeman Hospital.

A hand which can respond automatically

Publishing their findings today in the Journal of Neural Engineering, co-author on the study Dr Kianoush Nazarpour, a Senior Lecturer in Biomedical Engineering at Newcastle University, explains:

“Prosthetic limbs have changed very little in the past 100 years – the design is much better and the materials are lighter and more durable, but they still work in the same way.

“Using computer vision, we have developed a bionic hand which can respond automatically – in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.

“Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison.

“Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking.”

Video: All-seeing hand
Artificial vision for artificial hands

Recent statistics show that in the UK there are around 600 new upper-limb amputees every year, of whom 50% are between 15 and 54 years old. In the US there are around 500,000 upper-limb amputees.

Current prosthetic hands are controlled via myoelectric signals – that is, the electrical activity of the muscles recorded from the skin surface of the stump.

Controlling them, says Dr Nazarpour, takes practice, concentration and, crucially, time.

Using neural networks – the basis for Artificial Intelligence – lead author on the study Ghazal Ghazaei showed the computer numerous object images and taught it to recognise the ‘grip’ needed for different objects.

“We would show the computer a picture of, for example, a stick,” explains Miss Ghazaei, who carried out the work as part of her PhD in the School of Electrical and Electronic Engineering at Newcastle University.  “But not just one picture, many images of the same stick from different angles and orientations, even in different light and against different backgrounds and eventually the computer learns what grasp it needs to pick that stick up.

“So the computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up.

“It is this which enables it to accurately assess and pick up an object which it has never seen before – a huge step forward in the development of bionic limbs.”

Grouping objects by size, shape and orientation, according to the type of grasp that would be needed to pick them up, the team programmed the hand to perform four different ‘grasps’: palm wrist neutral (such as when you pick up a cup); palm wrist pronated (such as picking up the TV remote); tripod (thumb and two fingers) and pinch (thumb and first finger).

Using a 99p camera fitted to the prosthesis, the hand ‘sees’ an object, picks the most appropriate grasp and sends a signal to the hand – all within a matter of milliseconds and ten times faster than any other limb currently on the market.
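The see-then-grasp pipeline amounts to: take a snapshot, classify the grasp, command the hand. Here is a hedged sketch with a hand-written stub standing in for the trained neural network; only the four grasp names come from the article, and the feature names and classification rules are invented.

```python
# Toy version of the camera-to-grasp pipeline. The real system uses a
# trained neural network; this rule-based stub is a stand-in so the
# data flow is runnable. Feature names and thresholds are invented.
GRASPS = ["palm wrist neutral", "palm wrist pronated", "tripod", "pinch"]

def classify_grasp(features: dict) -> str:
    """Stand-in for the neural-network grasp classifier."""
    if features["width_cm"] > 6:
        return "palm wrist neutral"   # e.g. picking up a cup
    if features["flat"]:
        return "palm wrist pronated"  # e.g. picking up a TV remote
    if features["width_cm"] > 2:
        return "tripod"               # thumb and two fingers
    return "pinch"                    # thumb and first finger

def hand_command(snapshot_features: dict) -> str:
    grasp = classify_grasp(snapshot_features)
    assert grasp in GRASPS
    return grasp

print(hand_command({"width_cm": 8.0, "flat": False}))  # palm wrist neutral
```

The point of the learned classifier over a rule table like this one is exactly what Dr Nazarpour describes: it generalises to objects it has never seen, instead of matching against a fixed database.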

“One way would have been to create a photo database of every single object but clearly that would be a massive task and you would literally need every make of pen, toothbrush, shape of cup – the list is endless,” says Dr Nazarpour.

“The beauty of this system is that it’s much more flexible and the hand is able to pick up novel objects – which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before.”

Video: Grasp classification in myoelectric hands
First step towards a fully connected bionic hand

The work is part of a larger research project to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain.

Led by Newcastle University and involving experts from the universities of Leeds, Essex, Keele, Southampton and Imperial College London, the aim is to develop novel electronic devices that connect to the forearm neural networks to allow two-way communications with the brain.

Reminiscent of Luke Skywalker’s artificial hand, the electrodes in the bionic limb would wrap around the nerve endings in the arm.  This would mean for the first time the brain could communicate directly with the prosthesis.

The ‘hand that sees’, explains Dr Nazarpour, is an interim solution that will bridge the gap between current designs and the future.

“It’s a stepping stone towards our ultimate goal,” he says. “But importantly, it’s cheap and it can be implemented soon because it doesn’t require new prosthetics – we can just adapt the ones we have.”

Anne Ewing, Advanced Occupational Therapist at Newcastle upon Tyne Hospitals NHS Foundation Trust, has been working with Dr Nazarpour and his team.

“I work with upper limb amputee patients which is extremely rewarding, varied and at times challenging,” she said.

“We always strive to put the patient at the heart of everything we do and so make sure that any interventions are client centred to ensure patients’ individual goals are met either with a prosthesis or alternative method of carrying out a task.

“This project in collaboration with Newcastle University has provided an exciting opportunity to help shape the future of upper limb prosthetics, working towards achieving patients’ prosthetic expectations and it is wonderful to have been involved.”

Case Study – Doug McIntosh, 56

“For me it was literally a case of life or limb,” says Doug McIntosh, who lost his right arm in 1997 through cancer.

“I had developed a rare form of cancer called epithelial sarcoma, which develops in the deep tissue under the skin, and the doctors had no choice but to amputate the limb to save my life.

“Losing an arm and battling cancer with three young children was life changing.  I left my job as a life support supervisor in the diving industry and spent a year fund-raising for cancer charities.

“It was this and my family that motivated me and got me through the hardest times.”

Since then, Doug has gone on to be an inspiration to amputees around the world. He became the first amputee to cycle from John O’Groats to Land’s End in 100 hours and has since cycled around the coastline of Britain, run three London Marathons, ridden the Dallaglio Flintoff Cycle Slam in 2012 and 2014, and in 2014 cycled with the British Lions Rugby Team to Murrayfield Rugby Stadium for the “Walking With The Wounded” charity. He is currently preparing to tackle Mont Ventoux this September, three cycle climbs in one day, for Cancer Research UK and Maggie’s Cancer Centres.

Involved in the early trials of the first myoelectric prosthetic limbs, Doug has been working with the Newcastle team to trial the new hand that sees.

“The problem is there’s nothing yet that really comes close to feeling like the real thing,” explains the father-of-three who lives in Westhill, Aberdeen with his wife of 32 years, Diane.

“Some of the prosthetics look very realistic but they feel slow and clumsy when you have a working hand to compare them to.

“In the end I found it easier just to do without and learn to adapt.  When I do use a prosthesis I use a split hook which doesn’t look pretty but does the job.”

But he says the new, responsive hand being developed in Newcastle is a ‘huge leap forward’.

“This offers for the first time a real alternative for upper limb amputees,” he says.

“For me, one of the ways of dealing with the loss of my hand was to be very open about it and answer people’s questions.  But not everyone wants that and so to have the option of a hand that not only looks realistic but also works like a real hand would be an amazing breakthrough and transform the recovery time – both physically and mentally – for many amputees.”


By Restoring a Sense of Touch to Amputees, HAPTIX Seeks to Overcome Physical and Psychological Effects of Upper Limb Loss


Providing a direct and powerful link between user intent and prosthesis control

To understand the meaning of “proprioception,” try a simple experiment. Close your eyes and lift your right arm above your head. Then, move it down so that it’s parallel to the ground. Make a fist and release it. Move it forward, and then swing it around behind you like you’re stretching. Finally, freeze in place, open your eyes, and look. Is your arm positioned where you thought it would be?

For most people, the answer will be, “Yes.” That’s because your brain and nervous system worked together to move your body according to your intent and processed the sensory feedback to know where your arm was in space despite not being able to visually track it.

For many upper-limb amputees using prosthetic devices, the answer would be, “No.” They wouldn’t have confidence that their device would be where they think it is because current prostheses lack provisions for providing complex tactile and proprioceptive feedback to the user. Without this feedback, even the most advanced prosthetic limbs will remain numb to the user and manipulation functions will be impaired.

DARPA’s new Hand Proprioception and Touch Interfaces (HAPTIX) program seeks to deliver those kinds of naturalistic sensations to amputees, and in the process, enable intuitive, dexterous control of advanced prosthetic devices that substitute for amputated limbs, provide the psychological benefit of improving prosthesis “embodiment,” and reduce phantom limb pain. The program builds on neural-interface technologies advanced during DARPA’s Revolutionizing Prosthetics and Reliable Neural-Interface Technology (RE-NET) programs that made major steps forward in providing a direct and powerful link between user intent and prosthesis control.

HAPTIX aims to achieve its goals by developing interface systems that measure and decode motor signals recorded in peripheral nerves and/or muscles. The program will adapt one of the advanced prosthetic limb systems developed under Revolutionizing Prosthetics to incorporate sensors that provide tactile and proprioceptive feedback to the user, delivered through patterned stimulation of sensory pathways in the peripheral nerve. One of the key challenges will be to identify stimulation patterning strategies that elicit naturalistic sensations of touch and movement. The ultimate goal is to create a fully-implantable device that is safe, reliable, effective, and approved for human use.
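The interface HAPTIX describes is a closed loop: decode motor intent from peripheral-nerve or muscle recordings, drive the prosthesis, then encode fingertip contact as patterned stimulation of sensory pathways. The following is a toy sketch of that loop; every threshold and parameter below is invented and stands in for the program’s actual (unpublished here) signal processing.

```python
# Toy decode/encode loop for a HAPTIX-style interface. The real program
# targets implanted peripheral-nerve electrodes; all numbers here are
# invented for illustration only.
def decode_intent(nerve_samples: list[float]) -> str:
    """Toy decoder: mean activity above a threshold means 'close hand'."""
    mean = sum(nerve_samples) / len(nerve_samples)
    return "close" if mean > 0.5 else "open"

def encode_touch(contact_force_n: float) -> dict:
    """Map fingertip contact force to a made-up stimulation pattern."""
    return {
        "pulse_rate_hz": min(300.0, 50.0 + 25.0 * contact_force_n),
        "amplitude_ma": 0.5,  # fixed placeholder amplitude
    }

intent = decode_intent([0.7, 0.8, 0.6])
pattern = encode_touch(contact_force_n=4.0)
print(intent, pattern["pulse_rate_hz"])  # close 150.0
```

The hard research problem the program names, finding stimulation patterns that feel natural rather than like buzzing, lives inside `encode_touch`; a linear force-to-rate map like this is only the simplest possible placeholder.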

“Peripheral nerves are information-rich and readily accessible targets for interfacing with the human nervous system. Research performed under DARPA’s RE-NET program and elsewhere showed that these nerves maintain motor and sensory fibers that previously innervated the amputated limb, and that these fibers remain functional for decades after limb loss,” said Doug Weber, the DARPA program manager. “HAPTIX will try to tap in to these biological communication pathways so that users can control and sense the prosthesis via the same neural signaling pathways used for intact hands and arms.”

In addition to the improved motor performance that restored touch and proprioception would convey to the user, mounting evidence suggests that sensory stimulation in amputees may provide important psychological benefits such as improving prosthesis “embodiment” and reducing the phantom limb pain that is suffered by approximately 80 percent of amputees. For this reason, DARPA seeks the inclusion of psychologists in the multi-disciplinary teams of scientists, engineers, and clinicians proposing to develop the electrodes, algorithms, and electronics technology components for the HAPTIX system. Teams will need to consider how the use of HAPTIX system may impact the user in several important domains including motor and sensory function, psychology, pain, and quality of life.

“We have the opportunity to not only significantly improve an amputee’s ability to control a prosthetic limb, but to make a profound, positive psychological impact,” Weber said. “Amputees view existing prostheses as if they were tools, like a wrench, used only to perform a specific job, so many people abandon their prostheses unless absolutely needed. We believe that HAPTIX will create a sensory experience so rich and vibrant that the user will want to wear his or her prosthesis full-time and accept it as a natural extension of the body. If we can achieve that, DARPA is even closer to fulfilling its commitment to help restore full and natural functionality to wounded service members.”

The program plan culminates with a 12-month, take-home trial of the complete HAPTIX prosthesis system. To aid performers in the completion of the steps necessary to achieve regulatory approvals for human trials, DARPA consulted with the U.S. Food and Drug Administration to incorporate regulatory timelines into the program process.

“Once development of the HAPTIX system is complete, we want people to benefit immediately and be able to use their limb all day, every day, and in every aspect of their lives,” Weber said. “The experience needs to be comfortable and easy. Take-home trials are the first step in making that vision a reality.”

If it is successful, the HAPTIX program will create fully-implantable, modular, and reconfigurable neural-interface microsystems that communicate wirelessly with external modules, such as the prosthesis interface link. Because such technology would have broad application and could fuel future medical devices, HAPTIX also plans to fund teams to pursue the science and technology that would support next-generation HAPTIX capabilities.


