Electrical engineers at the University of California San Diego have developed a faster collision detection algorithm that uses machine learning to help robots avoid moving objects and weave through complex, rapidly changing environments in real time. The algorithm, dubbed “Fastron,” runs up to 8 times faster than existing collision detection algorithms.
A team of engineers, led by Michael Yip, a professor of electrical and computer engineering and member of the Contextual Robotics Institute at UC San Diego, will present the new algorithm at the first annual Conference on Robot Learning, Nov. 13 to 15 at Google headquarters in Mountain View, Calif. The invitation-only conference brings together top machine learning scientists, and Yip’s team will deliver one of the long talks during the three-day event.
The team envisions that Fastron will be broadly useful for robots that operate in human environments, where they must work fluidly around moving objects and people. One application they are exploring in particular is robot-assisted surgery using the da Vinci Surgical System, in which a robotic arm would autonomously perform assistive tasks (suction, irrigation or pulling tissue back) without getting in the way of the surgeon-controlled arms or the patient’s organs.
“This algorithm could help a robot assistant cooperate in surgery in a safe way,” Yip said.
The team also envisions that Fastron could be used for robots that work in homes for assisted living, as well as in computer graphics for the gaming and movie industries, where collision checking is a common bottleneck.
A problem with existing collision detection algorithms is that they are computationally demanding. They spend a lot of time specifying all the points in a given space (the specific 3D geometries of the robot and obstacles) and performing collision checks on every single point to determine whether two bodies are intersecting at any given time. The computation gets even more demanding when obstacles are moving.
To lighten the computational load, Yip and his team in the Advanced Robotics and Controls Lab (ARClab) at UC San Diego developed a minimalistic approach to collision detection. The result was Fastron, an algorithm that uses machine learning strategies—which are traditionally used to classify objects—to classify collisions versus non-collisions in dynamic environments. “We actually don’t need to know all the specific geometries and points. All we need to know is whether the robot’s current position is in collision or not,” said Nikhil Das, an electrical engineering Ph.D. student in Yip’s group and the study’s first author.
Fastron simulation: the autonomous arm (blue arm) reaches the target configuration (wireframe arm) while avoiding the motions of a human-controlled arm (red arm). Image courtesy of ARClab at UC San Diego.
The Fastron algorithm
The name Fastron comes from combining “fast” and “perceptron,” a machine learning technique for performing classification. An important feature of Fastron is that it updates its classification boundary very quickly to accommodate moving scenes, something that has generally been challenging for the machine learning community.
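For context, a perceptron is one of the simplest classifiers: it keeps a linear decision rule and nudges its weights only when it makes a mistake. The sketch below is the standard textbook perceptron, shown purely as background; it is not Fastron itself.

```python
import numpy as np

def train_perceptron(X, y, epochs=100):
    """Classic perceptron: learn w, b so that sign(w @ x + b) matches labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:   # misclassified: correct the weights
                w += yi * xi
                b += yi
                mistakes += 1
        if mistakes == 0:                # converged on linearly separable data
            break
    return w, b
```

On linearly separable data the loop is guaranteed to terminate; the mistake-driven update is what makes the method cheap enough to rerun as the scene changes.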
Fastron’s active learning strategy works using a feedback loop. It starts out by creating a model of the robot’s configuration space, or C-space, which is the space showing all possible positions the robot can attain. Fastron models the C-space using just a sparse set of points, consisting of a small number of so-called collision points and collision-free points. The algorithm then defines a classification boundary between the collision and collision-free points—this boundary is essentially a rough outline of where the abstract obstacles are in the C-space. As obstacles move, the classification boundary changes. Rather than performing collision checks on each point in the C-space, as is done with other algorithms, Fastron intelligently selects checks near the boundaries. Once it classifies the collisions and non-collisions, the algorithm updates its classifier and then continues the cycle.
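The loop described above can be sketched as a kernel-perceptron-style classifier over a sparse set of sampled configurations. This is a minimal illustration under assumptions, not the authors’ implementation: the Gaussian kernel, the worst-margin update rule, and the `pad` bias (a stand-in for conservatively padding obstacles) are choices made here for clarity.

```python
import numpy as np

def gaussian_kernel(A, q, gamma=2.0):
    """Gaussian similarity between each row of A and configuration q."""
    return np.exp(-gamma * ((A - q) ** 2).sum(axis=-1))

def train(samples, labels, epochs=2000, gamma=2.0):
    """Fit a kernel-perceptron-style collision classifier to C-space samples.

    samples: (n, d) robot configurations; labels: (n,) +1 (collision) / -1 (free).
    Returns per-sample weights alpha; the decision value at a query q is
    sum_i alpha_i * K(sample_i, q).
    """
    K = np.array([gaussian_kernel(samples, s, gamma) for s in samples])
    alpha = np.zeros(len(samples))
    for _ in range(epochs):
        margins = labels * (K @ alpha)
        worst = np.argmin(margins)       # most misclassified sample
        if margins[worst] > 0:           # every sample on the right side: done
            break
        alpha[worst] += labels[worst]    # perceptron-style corrective update
    return alpha

def predict(alpha, samples, query, gamma=2.0, pad=0.0):
    """Classify a query configuration; pad > 0 biases the call toward 'collision'."""
    score = gaussian_kernel(samples, query, gamma) @ alpha + pad
    return 1 if score > 0 else -1
```

In a full pipeline, new collision checks would be requested only for configurations near the current boundary (where the score is close to zero), and the model retrained each cycle as obstacles move, rather than rechecking the whole space.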
Because Fastron’s models are simpler, the researchers set its collision checks to be more conservative. Since just a few points represent the entire space, Das explained, it’s not always certain what’s happening in the space between two points, so the team developed the algorithm to predict a collision in that space. “We leaned toward making a risk-averse model and essentially padded the workspace obstacles,” Das said. This means the robot can be tuned to be more conservative in sensitive settings such as surgery or assisted living at home.
So far, the team has demonstrated the algorithm in computer simulations of robots and obstacles. Moving forward, they are working to further improve Fastron’s speed and accuracy, with the goal of deploying it in robotic surgery and homecare robot settings.