By enabling them to ask a question when they’re confused, an algorithm developed at Brown University helps robots get better at fetching objects, an important task for future robot assistants.
If someone asks you to hand them a wrench from a table full of different-sized wrenches, you’d probably pause and ask, “Which one?” Robotics researchers from Brown University have now developed an algorithm that lets robots do the same thing — ask for clarification when they’re not sure what a person wants.
The research, which will be presented this spring at the International Conference on Robotics and Automation in Singapore, comes from Brown’s Humans to Robots Lab led by computer science professor Stefanie Tellex. Her work focuses on human-robot collaboration — making robots that can be good helpers to people at home and in the workplace.
“Fetching objects is an important task that we want collaborative robots to be able to do,” Tellex said. “But it’s easy for the robot to make errors, either by misunderstanding what we want, or by being in situations where commands are ambiguous. So what we wanted to do here was come up with a way for the robot to ask a question when it’s not sure.”
Tellex’s lab had previously developed an algorithm that enables robots to receive speech commands as well as information from human gestures. It’s a form of interaction that people use all the time. When we ask someone for an object, we’ll often point to it at the same time. Tellex and her team showed that when robots could combine the speech commands with gestures, they got better at correctly interpreting user commands.
Still, the system isn’t perfect. It runs into problems when there are lots of very similar objects in close proximity to each other. Take the workshop table, for example. Simply asking for “a wrench” isn’t specific enough, and it might not be clear which one a person is pointing to if a number of wrenches are clustered close together.
“What we want in these situations is for the robot to be able to signal that it’s confused and ask a question rather than just fetching the wrong object,” Tellex said.
The new algorithm does that. It enables the robot to quantify how certain it is that it knows what a user wants. When its certainty is high, the robot simply hands over the object as requested. When it’s less certain, the robot makes its best guess about what the person wants, then asks for confirmation by hovering its gripper over the object and asking, “This one?”
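The decision rule described here can be sketched in a few lines of Python. This is an illustrative simplification, not the Brown team’s actual implementation: the belief representation (a probability per candidate object) and the 0.8 confidence threshold are assumptions made for the example.

```python
def decide_action(belief, threshold=0.8):
    """Choose an action given a belief distribution over candidate objects.

    belief: dict mapping object name -> probability (assumed to sum to 1).
    Returns ("fetch", obj) when confident enough, ("confirm", obj) otherwise.
    """
    best = max(belief, key=belief.get)
    if belief[best] >= threshold:
        return ("fetch", best)    # certain enough: hand the object over
    return ("confirm", best)      # hover the gripper and ask "This one?"

# Two wrenches clustered together; speech plus gesture leaves it ambiguous:
print(decide_action({"wrench_a": 0.55, "wrench_b": 0.45}))
# One object clearly indicated:
print(decide_action({"wrench_a": 0.95, "wrench_b": 0.05}))
```

The key design point from the article is that the question is asked only below the threshold, so confident interactions stay fast.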
One of the important features of the system is that the robot doesn’t ask questions with every interaction. It asks intelligently.
“When the robot is certain, we don’t want it to ask a question because it just takes up time,” said Eric Rosen, an undergraduate working in Tellex’s lab and co-lead author of the research paper with graduate student David Whitney. “But when it is ambiguous, we want it to ask questions because mistakes can be more costly in terms of time.”
And even though the system asks only a very simple question, “it’s able to make important inferences based on the answer,” Whitney said. For example, say a user asks for a wrench and there are two wrenches on a table. If the user tells the robot that its first guess was wrong, the algorithm deduces that the other wrench must be the one that the user wants. It will then hand that one over without asking another question. Those kinds of inferences, known as implicatures, make the algorithm more efficient.
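The implicature described in the wrench example can be modeled as a simple belief update: a “no” answer eliminates the rejected candidate and renormalizes what remains, so with only two wrenches the other one becomes certain and no second question is needed. This sketch is an assumed Bayesian-style update for illustration, not the paper’s exact model.

```python
def update_after_no(belief, rejected):
    """Remove a rejected candidate from the belief and renormalize."""
    remaining = {obj: p for obj, p in belief.items() if obj != rejected}
    total = sum(remaining.values())
    return {obj: p / total for obj, p in remaining.items()}

belief = {"wrench_a": 0.55, "wrench_b": 0.45}
# The user says "no" to wrench_a; the other wrench must be the target,
# so the robot can hand it over without asking again:
print(update_after_no(belief, "wrench_a"))
```

With more than two candidates, the same update still helps: each answer prunes the belief and may push the best remaining guess over the confidence threshold.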
To test their system, the researchers asked untrained participants to come into the lab and interact with Baxter, a popular industrial and research robot. Participants asked Baxter for objects under different conditions. The team could set the robot to never ask questions, ask a question every time, or to ask questions only when uncertain. The trials showed that asking questions intelligently using the new algorithm was significantly better in terms of accuracy and speed compared to the other two conditions.
The system worked so well, in fact, that participants thought the robot had capabilities it actually didn’t have. For the purposes of the study, the researchers used a very simple language model — one that only understood the names of objects. However, participants told the researchers they thought the robot could understand prepositional phrases like, “on the left” or “closest to me,” which it could not. They also thought the robot might be tracking their eye-gaze, which it wasn’t. All the system was doing was making smart inferences after asking a very simple question.
In future work, Tellex and her team would like to combine the algorithm with more robust speech recognition systems, which might further increase the system’s accuracy and speed.
Ultimately, Tellex says, she hopes systems like this will help robots become useful collaborators both at home and at work.