For a newborn giraffe or wildebeest, being born can be a perilous introduction to the world—predators lie in wait for an opportunity to make a meal of the herd’s weakest member. This is why many species have evolved ways for their juveniles to find their footing within minutes of birth.
It’s an astonishing evolutionary feat that has long inspired biologists and roboticists. Now a team of researchers at the USC Viterbi School of Engineering believes it has become the first to create an AI-controlled robotic limb, driven by animal-like tendons, that can be tripped up and then recover within the time of the next footfall, a task the robot was never explicitly programmed to do.
Francisco J. Valero-Cuevas, a professor of Biomedical Engineering and of Biokinesiology & Physical Therapy at USC, working with USC Viterbi doctoral student Ali Marjaninejad and two other doctoral students, Darío Urbina-Meléndez and Brian Cohn, has developed a bio-inspired algorithm that learns a new walking task by itself after only five minutes of unstructured play, and can then adapt to other tasks without any additional programming.
Their work, featured as the March cover article of Nature Machine Intelligence, opens exciting possibilities for understanding human movement and disability, creating responsive prosthetics, and building robots that can interact with complex and changing environments, as in space exploration and search-and-rescue.
“Nowadays, it takes the equivalent of months or years of training for a robot to be ready to interact with the world, but we want to achieve the quick learning and adaptations seen in nature,” said senior author Valero-Cuevas, who also has appointments in computer science, electrical and computer engineering, aerospace and mechanical engineering and neuroscience at USC.
Marjaninejad, a doctoral candidate in the Department of Biomedical Engineering at USC and the paper’s lead author, said this breakthrough is akin to the natural learning that happens in babies. As Marjaninejad explains, the robot was first allowed to explore its environment through a process of free play (known as ‘motor babbling’).
“These random movements of the leg allow the robot to build an internal map of its limb and its interactions with the environment,” said Marjaninejad.
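The motor-babbling idea can be illustrated with a toy sketch. This is not the authors’ algorithm; the one-dimensional “limb,” its dynamics, and all function names here are invented purely to show the concept of issuing random commands, recording outcomes, and fitting an internal map that is later used to act deliberately:

```python
import random

def limb(command):
    # Stand-in for the unknown limb dynamics the robot must discover.
    # The robot never sees this function directly, only its outputs.
    return 0.8 * command + 0.1

def motor_babble(n_samples=100, seed=0):
    """Free play: issue random motor commands and record what the limb does."""
    rng = random.Random(seed)
    return [(cmd, limb(cmd))
            for cmd in (rng.uniform(-1.0, 1.0) for _ in range(n_samples))]

def fit_inverse_map(samples):
    """Build a crude internal map (least-squares line) from babbling data,
    mapping a desired outcome back to the command that should produce it."""
    n = len(samples)
    mean_c = sum(c for c, _ in samples) / n
    mean_o = sum(o for _, o in samples) / n
    cov = sum((c - mean_c) * (o - mean_o) for c, o in samples)
    var = sum((o - mean_o) ** 2 for _, o in samples)
    slope = cov / var  # d(command)/d(outcome)
    return lambda target: mean_c + slope * (target - mean_o)

samples = motor_babble()
inverse = fit_inverse_map(samples)
cmd = inverse(0.5)  # command predicted to move the limb to position 0.5
print(abs(limb(cmd) - 0.5) < 1e-6)  # the learned map hits the target
```

Because the toy limb is linear, the fitted inverse map is exact; a real tendon-driven limb is nonlinear and noisy, which is why the actual system needs richer models and continued adaptation rather than a single line fit.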
The paper’s authors say that, unlike most current work, their robots learn by doing, without any prior or parallel computer simulations to guide learning.
Marjaninejad added that this is particularly important because programmers can predict and code for many scenarios, but not for every possible one; pre-programmed robots are therefore inevitably prone to failure.
“However, if you let these [new] robots learn from relevant experience, then they will eventually find a solution that, once found, will be put to use and adapted as needed. The solution may not be perfect, but will be adopted if it is good enough for the situation. Not every one of us needs or wants—or is able to spend the time and effort— to win an Olympic medal,” Marjaninejad said.
Through this process of discovering their body and environment, the robot limbs designed at Valero-Cuevas’ lab at USC use their unique experience to develop the gait pattern that works well enough for them, producing robots with personalized movements. “You can recognize someone coming down the hall because they have a particular footfall,” Valero-Cuevas said. “Our robot uses its limited experience to find a solution to a problem that then becomes its personalized habit, or ‘personality’—We get the dainty walker, the lazy walker, the champ… you name it.”
The potential applications for the technology are many, particularly in assistive technology, where robotic limbs and exoskeletons that are intuitive and responsive to a user’s personal needs would be invaluable to those who have lost the use of their limbs. “Exoskeletons or assistive devices will need to naturally interpret your movements to accommodate what you need,” Valero-Cuevas said.
“Because our robots can learn habits, they can learn your habits, and mimic your movement style for the tasks you need in everyday life—even as you learn a new task, or grow stronger or weaker.”
According to the authors, the research will also have strong applications in space exploration and rescue missions, allowing robots to do what needs to be done without being escorted or supervised as they venture onto a new planet or into the uncertain and dangerous terrain left by natural disasters. These robots would be able to adapt, for example, to low or high gravity, or to loose rocks one day and mud after it rains.
The paper’s two additional authors, doctoral students Brian Cohn and Darío Urbina-Meléndez, weighed in on the research:
“The ability for a species to learn and adapt their movements as their bodies and environments change has been a powerful driver of evolution from the start,” said Cohn, a doctoral candidate in computer science at the USC Viterbi School of Engineering. “Our work constitutes a step towards empowering robots to learn and adapt from each experience, just as animals do.”
“I envision muscle-driven robots, capable of mastering what an animal takes months to learn, in just a few minutes,” said Urbina-Meléndez, a doctoral candidate in biomedical engineering who believes in the capacity for robotics to take bold inspiration from life. “Our work combining engineering, AI, anatomy and neuroscience is a strong indication that this is possible.”