Roboticists learn to teach robots from babies – robots that learn

A collaboration between UW developmental psychologists and computer scientists aims to enable robots to learn in the same way that children naturally do. The team used research on how babies follow an adult’s gaze to “teach” a robot to perform the same task. (University of Washington)

Babies learn about the world by exploring how their bodies move in space, by grabbing toys and pushing things off tables, and by watching and imitating what adults are doing.

But when roboticists want to teach a robot how to do a task, they typically either write code or physically move a robot’s arm or body to show it how to perform an action.

Now a collaboration between University of Washington developmental psychologists and computer scientists has demonstrated that robots can “learn” much like kids — by amassing data through exploration, watching a human do something and determining how to perform that task on its own.

“You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans,” said senior author Rajesh Rao, a UW professor of computer science and engineering.

“If you want people who don’t know anything about computer programming to be able to teach a robot, the way to do it is through demonstration — showing the robot how to clean your dishes, fold your clothes, or do household chores. But to achieve that goal, you need the robot to be able to understand those actions and perform them on their own.”

The research, which combines child development research from the UW’s Institute for Learning & Brain Sciences (I-LABS) with machine learning approaches, was published in a paper in November in the journal PLOS ONE.

In the paper, the UW team developed a new probabilistic model aimed at solving a fundamental challenge in robotics: building robots that can learn new skills by watching people and imitating them.

The roboticists collaborated with UW psychology professor and I-LABS co-director Andrew Meltzoff, whose seminal research has shown that children as young as 18 months can infer the goal of an adult’s actions and develop alternate ways of reaching that goal themselves.

In one example, infants saw an adult try to pull apart a barbell-shaped toy, but the adult failed to achieve that goal because the toy was stuck together and his hands slipped off the ends. The infants watched carefully and then decided to use alternate methods — they wrapped their tiny fingers all the way around the ends and yanked especially hard — duplicating what the adult intended to do.

Children acquire intention-reading skills, in part, through self-exploration that helps them learn the laws of physics and how their own actions influence objects, eventually allowing them to amass enough knowledge to learn from others and to interpret their intentions. Meltzoff thinks that one of the reasons babies learn so quickly is that they are so playful.

“Babies engage in what looks like mindless play, but this enables future learning. It’s a baby’s secret sauce for innovation,” Meltzoff said. “If they’re trying to figure out how to work a new toy, they’re actually using knowledge they gained by playing with other toys. During play they’re learning a mental model of how their actions cause changes in the world. And once you have that model you can begin to solve novel problems and start to predict someone else’s intentions.”

Rao’s team used that infant research to develop machine learning algorithms that allow a robot to explore how its own actions result in different outcomes. The robot then uses that learned probabilistic model to infer what a human wants it to do and to complete the task, and even to “ask” for help if it isn’t certain it can succeed on its own.
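The paper’s model is considerably richer than this, but a toy Python sketch conveys the learn-by-exploration idea: the robot tallies which outcomes its actions tend to produce, then inverts those learned probabilities to imitate a demonstrated outcome, asking for help when its confidence is low. The action names, outcome names, and numbers here are illustrative assumptions, not details from the paper.

```python
# Toy sketch (not the paper's actual model): a robot explores its own
# actions, tallies which outcomes each action produces, and later uses
# those learned probabilities to pick the action most likely to
# reproduce an outcome it saw a human demonstrate.
import random
from collections import Counter, defaultdict

ACTIONS = ["push_left", "push_right", "lift"]
OUTCOMES = ["object_left", "object_right", "object_raised"]

def world(action):
    """Stand-in for the physical world: actions usually succeed,
    but occasionally produce a random outcome (slips, noise)."""
    intended = dict(zip(ACTIONS, OUTCOMES))[action]
    return intended if random.random() < 0.8 else random.choice(OUTCOMES)

# 1. Self-exploration: try actions many times and count the outcomes.
counts = defaultdict(Counter)
for _ in range(1000):
    a = random.choice(ACTIONS)
    counts[a][world(a)] += 1

def p_outcome_given_action(outcome, action):
    total = sum(counts[action].values())
    return counts[action][outcome] / total if total else 0.0

# 2. Imitation: given an outcome demonstrated by a human, choose the
#    action the learned model says is most likely to reproduce it,
#    or ask for help when no action looks reliable enough.
def imitate(demonstrated_outcome, confidence_threshold=0.5):
    best = max(ACTIONS, key=lambda a: p_outcome_given_action(demonstrated_outcome, a))
    if p_outcome_given_action(demonstrated_outcome, best) < confidence_threshold:
        return "ask for help"
    return best

print(imitate("object_raised"))  # expected: "lift"
```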

The team tested its robotic model in two different scenarios: a computer simulation experiment in which a robot learns to follow a human’s gaze, and another experiment in which an actual robot learns to imitate human actions involving moving toy food objects to different areas on a tabletop.

In the gaze experiment, the robot learns a model of its own head movements and assumes that the human’s head is governed by the same rules. The robot tracks the beginning and ending points of a human’s head movements as the human looks across the room and uses that information to figure out where the person is looking. The robot then uses its learned model of head movements to fixate on the same location as the human.

The team also recreated one of Meltzoff’s tests that showed infants who had experience with visual barriers and blindfolds weren’t interested in looking where a blindfolded adult was looking, because they understood the person couldn’t actually see. Once the team enabled the robot to “learn” what the consequences of being blindfolded were, it no longer followed the human’s head movement to look at the same spot.
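A minimal 2-D sketch of that inference, again with made-up geometry rather than the paper’s actual model: the robot converts the human’s final head angle into a line of sight, fixates on the object whose direction best matches that line, and declines to follow once it has learned that a blindfolded human cannot see.

```python
# Illustrative 2-D gaze-following sketch: the robot assumes the human's
# head pans like its own, turns the observed final head angle into a
# line of sight, and picks the object closest to that line, unless it
# has learned the human is blindfolded and therefore cannot see.
import math

OBJECTS = {"ball": (2.0, 3.0), "cup": (-1.5, 2.0), "toy": (0.0, 4.0)}

def gaze_target(head_pos, head_angle_rad, blindfolded, objects=OBJECTS):
    if blindfolded:
        return None  # the demonstrator can't see; don't follow the head
    hx, hy = head_pos
    best, best_err = None, float("inf")
    for name, (ox, oy) in objects.items():
        # Angular error between the gaze ray and the object direction,
        # wrapped into [-pi, pi) so directions compare correctly.
        diff = math.atan2(oy - hy, ox - hx) - head_angle_rad
        err = abs((diff + math.pi) % (2 * math.pi) - math.pi)
        if err < best_err:
            best, best_err = name, err
    return best

print(gaze_target((0.0, 0.0), math.atan2(3.0, 2.0), blindfolded=False))  # "ball"
print(gaze_target((0.0, 0.0), math.atan2(3.0, 2.0), blindfolded=True))   # None
```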

Read more: UW roboticists learn to teach robots from babies

The Latest on: Robots that learn

via Bing News

Hummingbird: An Educational Robotics Kit Designed To Get Girls Into Engineering

Though designed with girls in mind, it turns out the kit, which shows that engineering can be a creative and artistic endeavor, helps teach and inspire anyone who didn’t think they wanted to make robots or learn about programming.

Want to make a robot of cardboard wrapped in tin foil that can twirl, flash lights and even impersonate the Star Wars robot R2-D2? Or a dragon of paper and popsicle sticks that flaps its wings and hisses? How about building a robotic arm with muscles fashioned from pantyhose? That’s what middle and high-school students have been doing, using a new educational robotics kit called Hummingbird. The kit was developed at Carnegie Mellon University’s Robotics Institute and is made by BirdBrain Technologies, a two-year-old CMU spin-off based in Pittsburgh.

Most educational robotics kits focus on building robots, but Hummingbird treats robotics as one element, combined with craft materials and text, for communicating thoughts, feelings or ideas. Students don’t need to know how to solder or how to program. The kit ($199) consists of a customized control board plus lights, sensors and motors that connect to the controller by inserting the leads of parts into plastic clamps on the board. Students program their creations on a computer by dragging and dropping icons, so they don’t have to learn a programming language. Teachers whose students have experimented with the kit say it fosters interest in technology among students ages 11 and up.

Read more . . .

via FastCoExist

Open Source Robotics Platform

Until someone develops a common platform for building robots (think of the combination of Windows and Intel that has made PCs so accessible), the technology will remain elusive to the general public.

At least that’s the contention of Willow Garage, Inc., a Menlo Park, Calif., company that on Wednesday made its PR2 personal robot available to the public.

PR2 comes with the basics: a mobile base, two arms for manipulation, a suite of sensors and two computers, each with eight processing cores, 24 gigabytes of RAM and two terabytes of hard-disk storage. Willow Garage is hoping that its robot will blossom with the help of an open community of devoted engineers and software developers who can build on the PR2’s basics and share their breakthroughs with each other. Call it open source for robots.

The field of robotics needs to become more standardized if it is to flourish, says Keenan Wyrobek, co-director of Willow Garage’s Personal Robotics Program. “PR2 is all about taking us from where we are today to where you can pretty much make your own robot as needed,” he says.

The PR2 comes with a robot operating system (ROS), which handles functions such as the robot’s computation and hardware control. ROS, like other open-source software, is free and can be tweaked by users, provided any improvements are shared back with the rest of the community of PR2 and ROS users. This community is key to the PR2’s success because it opens the project to ideas and input from engineers around the world who know how to write programs for robot navigation, vision, movement and other functions.
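In practice, ROS programs are small nodes that publish to and subscribe from named topics. A minimal sketch (ROS 1, Python) of a node streaming velocity commands to a mobile base is below; `base_controller/command` is the conventional topic for the PR2’s base, but the exact topic and rates depend on the robot’s configuration, so treat this as an assumption to verify.

```python
#!/usr/bin/env python
# Minimal ROS 1 node sketch: publish velocity commands to drive a
# robot's mobile base. Topic name follows PR2 convention; check your
# robot's controller configuration before use.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node("simple_base_driver")
    pub = rospy.Publisher("base_controller/command", Twist, queue_size=1)
    rate = rospy.Rate(10)  # controllers expect a steady command stream
    cmd = Twist()
    cmd.linear.x = 0.1     # creep forward at 0.1 m/s
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    drive_forward()
```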

Read more . . .
