Babies learn about the world by exploring how their bodies move in space, by grabbing toys and pushing things off tables, and by watching and imitating what adults do.
But when roboticists want to teach a robot how to do a task, they typically either write code or physically move a robot’s arm or body to show it how to perform an action.
Now a collaboration between University of Washington developmental psychologists and computer scientists has demonstrated that robots can “learn” much like kids — by amassing data through exploration, watching a human do something and determining how to perform that task on its own.
“You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans,” said senior author Rajesh Rao, a UW professor of computer science and engineering.
“If you want people who don’t know anything about computer programming to be able to teach a robot, the way to do it is through demonstration — showing the robot how to clean your dishes, fold your clothes, or do household chores. But to achieve that goal, you need the robot to be able to understand those actions and perform them on its own.”
The research, which combines child development research from the UW’s Institute for Learning & Brain Sciences (I-LABS) with machine learning approaches, was published in November in the journal PLOS ONE.
In the paper, the UW team developed a new probabilistic model aimed at solving a fundamental challenge in robotics: building robots that can learn new skills by watching people and imitating them.
The roboticists collaborated with UW psychology professor and I-LABS co-director Andrew Meltzoff, whose seminal research has shown that children as young as 18 months can infer the goal of an adult’s actions and develop alternate ways of reaching that goal themselves.
In one example, infants saw an adult try to pull apart a barbell-shaped toy, but the adult failed to achieve that goal because the toy was stuck together and his hands slipped off the ends. The infants watched carefully and then decided to use alternate methods — they wrapped their tiny fingers all the way around the ends and yanked especially hard — duplicating what the adult intended to do.
Children acquire intention-reading skills, in part, through self-exploration that helps them learn the laws of physics and how their own actions influence objects, eventually allowing them to amass enough knowledge to learn from others and to interpret their intentions. Meltzoff thinks that one of the reasons babies learn so quickly is that they are so playful.
“Babies engage in what looks like mindless play, but this enables future learning. It’s a baby’s secret sauce for innovation,” Meltzoff said. “If they’re trying to figure out how to work a new toy, they’re actually using knowledge they gained by playing with other toys. During play they’re learning a mental model of how their actions cause changes in the world. And once you have that model you can begin to solve novel problems and start to predict someone else’s intentions.”
Rao’s team used that infant research to develop machine learning algorithms that allow a robot to explore how its own actions result in different outcomes. The robot then uses that learned probabilistic model to infer what a human wants it to do and complete the task — and even to “ask” for help if it is not certain it can do so.
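The loop described above — explore, build a probabilistic model of action outcomes, then use it to infer intent or ask for help — can be sketched in a few lines. This is only a toy illustration, not the paper's actual model: the three-region tabletop world, the action names, the slip noise, and the 0.5 confidence threshold are all invented for the example.

```python
import random
from collections import defaultdict

random.seed(0)

# Hypothetical toy world: a block sits in one of three tabletop regions,
# and actions move it around with some slip noise.
REGIONS = ["left", "center", "right"]
ACTIONS = ["push_left", "push_right", "lift_place_center"]

def simulate(action, start):
    """Stand-in for the physical world the robot explores."""
    i = REGIONS.index(start)
    if random.random() < 0.1:          # gripper slips: block stays put
        return start
    if action == "push_left":
        return REGIONS[max(i - 1, 0)]
    if action == "push_right":
        return REGIONS[min(i + 1, len(REGIONS) - 1)]
    return "center"                    # lift_place_center

# 1) Exploration: try random actions, count how outcomes follow actions.
counts = defaultdict(lambda: defaultdict(int))
for _ in range(3000):
    action = random.choice(ACTIONS)
    start = random.choice(REGIONS)
    counts[(action, start)][simulate(action, start)] += 1

def p_outcome(action, start, end):
    """Learned model P(end | action, start), with Laplace smoothing."""
    c = counts[(action, start)]
    total = sum(c.values())
    return (c[end] + 1) / (total + len(REGIONS))

# 2) Imitation: after watching a human move the block start -> end, pick
#    the action most likely to reproduce that outcome. If no action is
#    likely enough, return None -- the robot "asks for help".
def infer_action(start, end, threshold=0.5):
    probs = {a: p_outcome(a, start, end) for a in ACTIONS}
    best = max(probs, key=probs.get)
    return best if probs[best] >= threshold else None
```

After exploration, watching the block move from "center" to "right" is attributed to "push_right", while a transition no single action can produce (such as "left" straight to "right") falls below the threshold and triggers a request for help.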
The team tested its robotic model in two different scenarios: a computer simulation experiment in which a robot learns to follow a human’s gaze, and another experiment in which an actual robot learns to imitate human actions involving moving toy food objects to different areas on a tabletop.
In the gaze experiment, the robot learns a model of its own head movements and assumes that the human’s head is governed by the same rules. The robot tracks the beginning and ending points of a human’s head movements as the human looks across the room and uses that information to figure out where the person is looking. The robot then uses its learned model of head movements to fixate on the same location as the human.
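The transfer step in the gaze experiment — learn a model of your own head, assume the human's head obeys the same rule — can be sketched with simple geometry. Everything concrete here is an assumption for illustration: a flat wall of targets at a fixed distance, a single pan angle per head, and a least-squares line fit standing in for whatever model the paper actually learns.

```python
import math

WALL_Y = 3.0   # assumed distance from the heads to the wall of targets (m)
ROBOT_X = 0.0  # robot head position along the wall axis

def gaze_point(head_x, pan):
    """Ground truth: where a head at (head_x, 0) fixates for a pan angle."""
    return head_x + WALL_Y * math.tan(pan)

# 1) Self-exploration: the robot sweeps its own head and records which
#    point it ends up fixating, giving (tan(pan), fixation) pairs.
pans = [-0.6, -0.3, 0.0, 0.3, 0.6]
xs = [math.tan(p) for p in pans]
ys = [gaze_point(ROBOT_X, p) for p in pans]

# Least-squares line fit: fixation = b0 + b1 * tan(pan).
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
     sum((x - mx) ** 2 for x in xs)
b0 = my - b1 * mx

# 2) Correspondence assumption: the human's head obeys the same learned
#    rule, shifted to the human's own position.
def infer_human_gaze(human_x, human_pan):
    return (b0 - ROBOT_X + human_x) + b1 * math.tan(human_pan)

# 3) The robot inverts its own model to fixate the inferred point.
def pan_for_target(target_x):
    return math.atan((target_x - b0) / b1)
```

Given the human's head position and pan angle, `infer_human_gaze` names the spot the human is looking at, and `pan_for_target` gives the head command that makes the robot fixate the same spot.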
The team also recreated one of Meltzoff’s tests that showed infants who had experience with visual barriers and blindfolds weren’t interested in looking where a blindfolded adult was looking, because they understood the person couldn’t actually see. Once the team enabled the robot to “learn” what the consequences of being blindfolded were, it no longer followed the human’s head movement to look at the same spot.