Electrical engineers at the University of California San Diego have developed a faster collision detection algorithm that uses machine learning to help robots avoid moving objects and weave through complex, rapidly changing environments in real time. The algorithm, dubbed “Fastron,” runs up to 8 times faster than existing collision detection algorithms.
A team of engineers, led by Michael Yip, a professor of electrical and computer engineering and member of the Contextual Robotics Institute at UC San Diego, will present the new algorithm at the first annual Conference on Robot Learning, Nov. 13 to 15 at Google headquarters in Mountain View, Calif. The invitation-only conference brings together top machine learning scientists, and Yip’s team will deliver one of the long talks during the three-day event.
The team envisions that Fastron will be broadly useful for robots that operate in human environments, where they must work fluidly around moving objects and people. One application they are exploring in particular is robot-assisted surgery using the da Vinci Surgical System, in which a robotic arm would autonomously perform assistive tasks (suction, irrigation or pulling tissue back) without getting in the way of the surgeon-controlled arms or the patient’s organs.
“This algorithm could help a robot assistant cooperate in surgery in a safe way,” Yip said.
The team also envisions that Fastron can be used for robots that work in homes for assisted living, as well as for computer graphics in the gaming and movie industries, where collision checking is often a computational bottleneck.
A problem with existing collision detection algorithms is that they are very computation-heavy. They spend a lot of time specifying all the points in a given space—the specific 3D geometries of the robot and obstacles—and performing collision checks on every single point to determine whether two bodies are intersecting at any given time. The computation gets even more demanding when obstacles are moving.
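The computational burden described above can be illustrated with a minimal brute-force checker that tests every robot point against every obstacle point. The point clouds, function name, and clearance threshold below are illustrative assumptions, not details of any production collision checker.

```python
import numpy as np

def naive_collision_check(robot_pts, obstacle_pts, clearance=0.05):
    """Return True if any robot point comes within `clearance` of any
    obstacle point -- O(n * m) distance tests per query, which must be
    repeated every time an obstacle moves."""
    for r in robot_pts:
        for o in obstacle_pts:
            if np.linalg.norm(r - o) < clearance:
                return True
    return False

# toy 2-D "geometries": two robot points, one distant obstacle point
robot = np.array([[0.0, 0.0], [0.1, 0.0]])
obstacle = np.array([[1.0, 1.0]])
print(naive_collision_check(robot, obstacle))  # distant bodies: False
```

Even this tiny example scales as the product of the two point counts; with detailed 3D meshes and moving obstacles, the per-query cost grows quickly, which is the bottleneck Fastron sidesteps.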
To lighten the computational load, Yip and his team in the Advanced Robotics and Controls Lab (ARClab) at UC San Diego developed a minimalistic approach to collision detection. The result was Fastron, an algorithm that uses machine learning strategies—which are traditionally used to classify objects—to classify collisions versus non-collisions in dynamic environments. “We actually don’t need to know all the specific geometries and points. All we need to know is whether the robot’s current position is in collision or not,” said Nikhil Das, an electrical engineering Ph.D. student in Yip’s group and the study’s first author.
Fastron simulation: the autonomous arm (blue arm) reaches the target configuration (wireframe arm) while avoiding the motions of a human-controlled arm (red arm). Image courtesy of ARClab at UC San Diego.
The Fastron algorithm
The name Fastron comes from combining “fast” and “perceptron,” a machine learning technique for performing classification. An important feature of Fastron is that it updates its classification boundaries very quickly to accommodate moving scenes, a longstanding challenge in machine learning.
Fastron’s active learning strategy works as a feedback loop. It starts out by creating a model of the robot’s configuration space, or C-space, which is the space of all possible positions the robot can attain. Fastron models the C-space using just a sparse set of points, consisting of a small number of so-called collision points and collision-free points. The algorithm then defines a classification boundary between the collision and collision-free points—this boundary is essentially a rough outline of where the obstacles lie in the C-space. As obstacles move, the classification boundary changes. Rather than performing collision checks on each point in the C-space, as other algorithms do, Fastron intelligently concentrates its checks near the boundary. Once it classifies the collisions and non-collisions, the algorithm updates its classifier and then continues the cycle.
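The loop described above can be sketched as a kernel perceptron that serves as a cheap proxy for the expensive geometric checker, with new checks focused near the decision boundary. The Gaussian kernel and its width, the toy one-dimensional obstacle, and all function names are illustrative assumptions, not details taken from the Fastron paper.

```python
import numpy as np

def kernel(a, b, gamma=50.0):
    # Gaussian similarity between support configurations and a query
    return np.exp(-gamma * np.sum((a - b) ** 2, axis=-1))

def true_collision(q, lo=0.4, hi=0.6):
    # stand-in for the expensive geometric collision checker:
    # a 1-D obstacle occupying [lo, hi] of the C-space
    return 1 if lo <= q[0] <= hi else -1

# sparse set of support configurations sampled from the C-space
rng = np.random.default_rng(0)
support = rng.uniform(0, 1, size=(40, 1))
labels = np.array([true_collision(q) for q in support])
alpha = np.zeros(len(support))  # perceptron weights, one per support point

def predict(q):
    # +1 = predicted collision, -1 = predicted free
    return np.sign(alpha @ kernel(support, q) + 1e-12)

# kernel-perceptron training: cycle until the support set is classified
for _ in range(100):
    mistakes = 0
    for i, q in enumerate(support):
        if predict(q) != labels[i]:
            alpha[i] += labels[i]  # perceptron-style additive update
            mistakes += 1
    if mistakes == 0:
        break

# active step: re-check (with the expensive checker) only the
# configurations whose classifier score is near the boundary
candidates = rng.uniform(0, 1, size=(200, 1))
scores = np.array([alpha @ kernel(support, q) for q in candidates])
near_boundary = candidates[np.abs(scores) < 0.5]
print(len(near_boundary), "of", len(candidates), "candidates re-checked")
```

The payoff of this structure is that when an obstacle moves, only the labels of points near the shifting boundary change, so the classifier can be updated incrementally instead of re-checking the whole space.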
Because Fastron’s models are simpler, the researchers set its collision checks to be more conservative. Since just a few points represent the entire space, Das explained, it’s not always certain what’s happening in the space between two points, so the team designed the algorithm to predict a collision in that space. “We leaned toward making a risk-averse model and essentially padded the workspace obstacles,” Das said. This means the robot can be tuned to be more conservative in sensitive environments like surgery, or in homes for assisted living.
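One simple way to realize this risk-averse behavior is to shift the decision threshold so that ambiguous configurations are treated as collisions, effectively padding the obstacles. The function name and the padding value below are illustrative tuning assumptions; `score` stands for any classifier's signed output (positive meaning collision).

```python
def conservative_label(score, padding=0.3):
    """Risk-averse labeling: predict collision (+1) unless the
    classifier is confidently collision-free (score below -padding)."""
    return 1 if score > -padding else -1

print(conservative_label(0.1))   # uncertain region -> treated as collision
print(conservative_label(-0.5))  # confidently free -> labeled free
```

Raising the padding value widens the region labeled as collision, trading away some free space for a larger safety margin, which is the tunable conservatism described above.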
So far, the team has demonstrated the algorithm in computer simulations of robots and obstacles. Moving forward, the researchers are working to further improve Fastron’s speed and accuracy, with the goal of implementing it in robotic surgery and home-care robot settings.