Robots today can perform space missions, solve a Rubik’s cube, sort hospital medication and even make pancakes. But most can’t manage the simple act of grasping a pencil and spinning it around to get a solid grip.
Intricate tasks that require dexterous in-hand manipulation — rolling, pivoting, bending, sensing friction and other things humans do effortlessly with our hands — have proved notoriously difficult for robots.
Now, a University of Washington team of computer scientists and engineers has built a robot hand that can not only perform dexterous manipulation but also learn from its own experience without needing humans to direct it.
Their latest results are detailed in a paper to be presented May 17 at the IEEE International Conference on Robotics and Automation.
“Hand manipulation is one of the hardest problems that roboticists have to solve,” said lead author Vikash Kumar, a UW doctoral student in computer science and engineering. “A lot of robots today have pretty capable arms but the hand is as simple as a suction cup or maybe a claw or a gripper.”
By contrast, the UW research team spent years custom building one of the most highly capable five-fingered robot hands in the world. Then they developed an accurate simulation model that enables a computer to analyze movements in real time. In their latest demonstration, they applied the model to the hardware to carry out real-world tasks, like rotating an elongated object.
With each attempt, the robot hand gets progressively more adept at spinning the tube, thanks to machine learning algorithms that help it model both the basic physics involved and plan which actions it should take to achieve the desired result. (This demonstration begins at 1:47 in the video.)
This autonomous learning approach developed by the UW Movement Control Laboratory contrasts with robotics demonstrations that require people to program each individual movement of the robot’s hand in order to complete a single task.
“Usually people look at a motion and try to determine what exactly needs to happen — the pinky needs to move that way, so we’ll put some rules in and try it and if something doesn’t work, oh the middle finger moved too much and the pen tilted, so we’ll try another rule,” said senior author and lab director Emo Todorov, UW associate professor of computer science and engineering and of applied mathematics.
“It’s almost like making an animated film — it looks real but there was an army of animators tweaking it,” Todorov said. “What we are using is a universal approach that enables the robot to learn from its own movements and requires no tweaking from us.”
Building a dexterous, five-fingered robot hand poses challenges in both design and control. The first involved building a mechanical hand with enough speed, strength, responsiveness and flexibility to mimic basic behaviors of a human hand.
The UW’s dexterous robot hand — which the team built at a cost of roughly $300,000 — uses a Shadow Hand skeleton actuated with a custom pneumatic system and can move faster than a human hand. It is too expensive for routine commercial or industrial use, but it allows the researchers to push core technologies and test innovative control strategies.
“There are a lot of chaotic things going on and collisions happening when you touch an object with different fingers, which is difficult for control algorithms to deal with,” said co-author Sergey Levine, UW assistant professor of computer science and engineering who worked on the project as a postdoctoral fellow at the University of California, Berkeley. “The approach we took was quite different from a traditional controls approach.”
The team first developed algorithms that allowed a computer to model highly complex five-fingered behaviors and plan movements to achieve different outcomes — like typing on a keyboard or dropping and catching a stick — in simulation.
Most recently, the research team has transferred the models to work on the actual five-fingered hand hardware, which never behaves exactly as it does in simulation. As the robot hand performs different tasks, the system collects data from various sensors and motion capture cameras and employs machine learning algorithms to continually refine and develop more realistic models.
“It’s like sitting through a lesson, going home and doing your homework to understand things better and then coming back to school a little more intelligent the next day,” said Kumar.
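The learn-from-experience loop described above can be sketched in miniature: gather transitions by acting, fit a dynamics model to the recorded data, then plan actions against the learned model. The sketch below is a toy illustration only — it uses an invented two-dimensional linear system standing in for the hand's (high-dimensional, nonlinear) dynamics, and a one-step greedy rule in place of the team's trajectory optimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the real system's unknown dynamics: x' = A x + B u + noise.
# The learner never sees A_true or B_true directly -- only transitions.
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.9]])
B_true = np.array([[0.0],
                   [0.1]])

def step(x, u):
    """One step of the 'real' system, observed through sensor noise."""
    return A_true @ x + B_true @ u + rng.normal(0.0, 0.01, size=2)

# 1. Collect experience: apply random actions, record (x, u, x') triples.
X, U, Xn = [], [], []
x = np.array([1.0, 0.0])
for _ in range(300):
    u = rng.normal(0.0, 1.0, size=1)
    xn = step(x, u)
    X.append(x)
    U.append(u)
    Xn.append(xn)
    x = xn

# 2. Fit a linear model  x' ~ [A_hat B_hat] [x; u]  by least squares.
Z = np.hstack([np.array(X), np.array(U)])            # shape (300, 3)
W, *_ = np.linalg.lstsq(Z, np.array(Xn), rcond=None)
A_hat, B_hat = W.T[:, :2], W.T[:, 2:]

# 3. Plan with the learned model: choose u minimizing the predicted
# next-state norm (closed form for a 1-D action; a real planner would
# optimize a whole trajectory, not one greedy step).
def plan(x):
    b = B_hat.ravel()
    return np.array([-(b @ (A_hat @ x)) / (b @ b)])

# Running the learned controller drives the toy state toward zero.
x = np.array([1.0, 0.5])
for _ in range(30):
    x = step(x, plan(x))
```

Each pass through steps 1–3 is one round of Kumar's "homework": the data gathered while acting feeds the next, slightly better model, with no human tweaking rules in between.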
So far, the team has demonstrated local learning with the hardware system — which means the hand can continue to improve at a discrete task that involves manipulating the same object in roughly the same way. Next steps include beginning to demonstrate global learning — which means the hand could figure out how to manipulate an unfamiliar object or a new scenario it hasn’t encountered before.