Researchers from NVIDIA, led by Stan Birchfield and Jonathan Tremblay, developed a first-of-its-kind deep learning-based system that can teach a robot to complete a task simply by observing a human perform it. The method is designed to enhance communication between humans and robots and, at the same time, to further research toward people working seamlessly alongside robots.
“For robots to perform useful tasks in real-world settings, it must be easy to communicate the task to the robot; this includes both the desired result and any hints as to the best means to achieve that result,” the researchers stated in their research paper. “With demonstrations, a user can communicate a task to the robot and provide clues as to how to best perform the task.”
Using NVIDIA TITAN X GPUs, the researchers trained a sequence of neural networks to perform duties associated with perception, program generation, and program execution. As a result, the robot was able to learn a task from a single demonstration in the real world.
Once the robot sees a task, it generates a human-readable description of the steps necessary to re-perform the task. The description allows the user to quickly identify and correct any issues with the robot’s interpretation of the human demonstration before execution on the real robot.
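The paper does not publish its program-generation code, but the idea of a reviewable, human-readable plan can be sketched as follows. The relation format (pairs of block names meaning "place X on Y") and the function name are assumptions for illustration only, not NVIDIA's actual representation:

```python
# Illustrative sketch (not NVIDIA's code): render an inferred task plan
# as the kind of human-readable step list the paper describes, so a user
# can spot and correct misinterpretations before the robot executes it.

def describe_program(relations):
    """Render inferred stacking relations as numbered, readable steps."""
    steps = []
    for i, (top, bottom) in enumerate(relations, start=1):
        steps.append(f"{i}. Place {top} on top of {bottom}")
    return "\n".join(steps)

# Hypothetical plan inferred from a demonstration:
plan = [("red block", "blue block"), ("yellow block", "red block")]
print(describe_program(plan))
```

Presenting the plan as text before execution gives the user a cheap verification step: a wrong relation is caught by reading one line rather than by watching the robot fail.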
The key to achieving this capability is leveraging the power of synthetic data to train the neural networks. Current approaches to training neural networks require large amounts of labeled training data, which is a serious bottleneck in these systems. With synthetic data generation, an almost infinite amount of labeled training data can be produced with very little effort.
This is also the first time an image-centric domain randomization approach has been used on a robot. Domain randomization is a technique to produce synthetic data with large amounts of diversity, which then fools the perception network into seeing the real-world data as simply another variation of its training data. The researchers chose to process the data in an image-centric manner to ensure that the networks are not dependent on the camera or environment.
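A minimal sketch of the domain-randomization idea: every synthetic training image is rendered under randomly sampled scene parameters, so that real-world images fall within the spread of variations the network has already seen. The specific parameters and ranges below are assumptions for illustration, not those used in the paper:

```python
# Illustrative domain-randomization sketch (assumed parameters, not the
# paper's): sample a fresh set of scene parameters per synthetic image
# so lighting, viewpoint, color, clutter, and backdrop all vary widely.
import random

def sample_scene_params(rng):
    return {
        "light_intensity": rng.uniform(0.2, 2.0),  # vary lighting strength
        "camera_yaw_deg": rng.uniform(-45.0, 45.0),  # vary viewpoint
        "object_hue": rng.random(),                 # vary object color
        "num_distractors": rng.randint(0, 10),      # add scene clutter
        "background_id": rng.randrange(1000),       # random backdrop image
    }

rng = random.Random(0)  # seeded for reproducibility
dataset_params = [sample_scene_params(rng) for _ in range(100)]
```

Because labels are produced by the renderer itself, each randomized image comes with perfect ground truth at no annotation cost, which is what makes the "almost infinite" labeled data claim practical.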
“The perception network as described applies to any rigid real-world object that can be reasonably approximated by its 3D bounding cuboid,” the researchers said. “Despite never observing a real image during training, the perception network reliably detects the bounding cuboids of objects in real images, even under severe occlusions.”
For their demonstration, the team trained object detectors on several colored blocks and a toy car. The system was taught the physical relationships between blocks: whether they are stacked on top of one another or placed next to each other.
In the video above, the human operator shows the robot a pair of stacks of cubes. The system then infers an appropriate program and places the cubes in the correct order. Because it takes the current state of the world into account during execution, the system can recover from mistakes in real time.
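The recovery behavior comes from closing the loop: the robot re-perceives the world before each step instead of blindly replaying the plan. A toy sketch of that control loop (the function names and the plan format are assumptions for illustration, not the paper's implementation):

```python
# Illustrative closed-loop execution sketch (assumed, not NVIDIA's code):
# each step is re-checked against the perceived world state, so a failed
# or disturbed placement is retried rather than derailing the plan.

def execute(plan, perceive, place):
    """Run each (top, bottom) step, retrying until perception confirms it."""
    for top, bottom in plan:
        while not perceive(top, bottom):  # is `top` already on `bottom`?
            place(top, bottom)            # attempt the placement again

# Toy simulation: the "world" is a set of achieved (top, bottom) relations.
state = set()
execute(
    [("red", "blue"), ("yellow", "red")],
    perceive=lambda t, b: (t, b) in state,
    place=lambda t, b: state.add((t, b)),
)
```

In the real system, perception would come from the bounding-cuboid detector described above; the point of the sketch is only that checking state before acting is what lets execution absorb mistakes.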
The researchers will present their paper at the International Conference on Robotics and Automation (ICRA) in Brisbane, Australia this week.
The team says they will continue to explore the use of synthetic training data for robotic manipulation to extend the method to additional scenarios.
via NVIDIA: Read the research paper