Carnegie Mellon builds dataset capturing interaction of sound, action, vision
People rarely use just one sense to understand the world, but robots typically rely only on vision and, increasingly, touch. Carnegie Mellon University researchers find that robot perception could improve markedly by adding another sense: hearing.
In what they say is the first large-scale study of the interactions between sound and robotic action, researchers at CMU’s Robotics Institute found that sounds could help a robot differentiate between objects, such as a metal screwdriver and a metal wrench. Hearing also could help robots determine what type of action caused a sound and help them use sounds to predict the physical properties of new objects.
“A lot of preliminary work in other fields indicated that sound could be useful, but it wasn’t clear how useful it would be in robotics,” said Lerrel Pinto, who recently earned his Ph.D. in robotics at CMU and will join the faculty of New York University this fall. He and his colleagues found the performance rate was quite high, with robots that used sound successfully classifying objects 76 percent of the time.
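The idea behind sound-based classification can be illustrated with a toy sketch: extract a spectral fingerprint from each impact sound and match it against known objects. This is only an illustrative nearest-centroid example on synthetic tones, not the researchers' actual pipeline; the object names and frequencies are hypothetical stand-ins.

```python
import numpy as np

def spectral_features(waveform, n_fft=512):
    """Normalized magnitude spectrum as a crude audio fingerprint."""
    spec = np.abs(np.fft.rfft(waveform[:n_fft]))
    return spec / (np.linalg.norm(spec) + 1e-8)

# Hypothetical stand-ins for impact sounds: assume a metal wrench
# "rings" at a higher frequency than a metal screwdriver.
t = np.linspace(0, 0.032, 512, endpoint=False)  # 32 ms at 16 kHz
screwdriver = np.sin(2 * np.pi * 800 * t)
wrench = np.sin(2 * np.pi * 2400 * t)

centroids = {"screwdriver": spectral_features(screwdriver),
             "wrench": spectral_features(wrench)}

def classify(waveform):
    """Return the known object whose fingerprint best matches the sound."""
    feats = spectral_features(waveform)
    return max(centroids, key=lambda name: feats @ centroids[name])

print(classify(wrench + 0.05 * np.cos(2 * np.pi * 60 * t)))  # noisy wrench
```

A real system would learn features from thousands of recordings rather than compare raw spectra, but the matching principle is the same.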
The results were so encouraging, he added, that it might prove useful to equip future robots with instrumented canes, enabling them to tap on objects they want to identify.
The researchers presented their findings last month during the virtual Robotics: Science and Systems conference. Other team members included Abhinav Gupta, associate professor of robotics, and Dhiraj Gandhi, a former master’s student who is now a research engineer at Facebook Artificial Intelligence Research’s Pittsburgh lab.
To perform their study, the researchers created a large dataset, simultaneously recording video and audio of 60 common objects — such as toy blocks, hand tools, shoes, apples and tennis balls — as they slid or rolled around a tray and crashed into its sides. They have since released this dataset, cataloging 15,000 interactions, for use by other researchers.
The team captured these interactions using an experimental apparatus they called Tilt-Bot — a square tray attached to the arm of a Sawyer robot. It was an efficient way to build a large dataset; they could place an object in the tray and let Sawyer spend a few hours moving the tray in random directions with varying levels of tilt as cameras and microphones recorded each action.
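The collection procedure described above amounts to a simple loop: command a random tilt, record audio and video, and log the action with its recordings. The sketch below simulates that loop in plain Python; the tilt bounds, clip names, and record layout are assumptions for illustration, not the actual Tilt-Bot interface.

```python
import random

def collect_tilt_interactions(n_interactions, max_tilt_deg=20.0, seed=0):
    """Simulate the Tilt-Bot loop: draw a random tray tilt for each
    interaction and log it with placeholder audio/video clip names.
    (A real run would command the Sawyer arm and save sensor streams.)"""
    rng = random.Random(seed)
    log = []
    for i in range(n_interactions):
        # Random direction and magnitude of tilt, as in the random
        # tray motions described in the article.
        roll = rng.uniform(-max_tilt_deg, max_tilt_deg)
        pitch = rng.uniform(-max_tilt_deg, max_tilt_deg)
        log.append({"action": (roll, pitch),
                    "audio": f"clip_{i:05d}.wav",
                    "video": f"clip_{i:05d}.mp4"})
    return log

# The released dataset catalogs 15,000 such interactions.
dataset = collect_tilt_interactions(15000)
print(len(dataset))
```

Randomizing the actions is what makes the apparatus efficient: no human needs to script each interaction, so a few unattended hours yield thousands of labeled sound-action pairs per object.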
They also collected some data beyond the tray, using Sawyer to push objects on a surface.
Though the size of this dataset is unprecedented, other researchers have also studied how intelligent agents can glean information from sound. For instance, Oliver Kroemer, assistant professor of robotics, led research into using sound to estimate the amount of granular materials, such as rice or pasta, by shaking a container, or estimating the flow of those materials from a scoop.
Pinto said the usefulness of sound for robots was therefore not surprising, though he and the others were surprised at just how useful it proved to be. They found, for instance, that a robot could use what it learned about the sound of one set of objects to make predictions about the physical properties of previously unseen objects.
“I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” he said. For instance, a robot couldn’t use sound to tell the difference between a red block and a green block. “But if it was a different object, such as a block versus a cup, it could figure that out.”