Approach may enable robots to move around hospitals, malls, and other areas with heavy foot traffic.
Just as drivers observe the rules of the road, most pedestrians follow certain social codes when navigating a hallway or a crowded thoroughfare: Keep to the right, pass on the left, maintain a respectable berth, and be ready to weave or change course to avoid oncoming obstacles while keeping up a steady walking pace.
Now engineers at MIT have designed an autonomous robot with “socially aware navigation” that can keep pace with foot traffic while observing these general codes of pedestrian conduct.
In drive tests performed inside MIT’s Stata Center, the robot, which resembles a knee-high kiosk on wheels, successfully avoided collisions while keeping up with the average flow of pedestrians. The researchers detail their design in a paper they will present at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in September.
“Socially aware navigation is a central capability for mobile robots operating in environments that require frequent interactions with pedestrians,” says Yu Fan “Steven” Chen, who led the work as a former MIT graduate student and is the lead author of the study. “For instance, small robots could operate on sidewalks for package and food delivery. Similarly, personal mobility devices could transport people in large, crowded spaces, such as shopping malls, airports, and hospitals.”
Chen’s co-authors are graduate student Michael Everett, former postdoc Miao Liu, and Jonathan How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics at MIT.
In order for a robot to make its way autonomously through a heavily trafficked environment, it must solve four main challenges: localization (knowing where it is in the world), perception (recognizing its surroundings), motion planning (identifying the optimal path to a given destination), and control (physically executing its desired path).
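The division of labor among these four stages can be pictured as a simple sense-plan-act loop. The function names and placeholder bodies below are purely illustrative, not taken from the MIT system; a real robot would back each stage with maps, sensor pipelines, a planner, and motor controllers:

```python
# Illustrative sketch of the four-stage autonomy pipeline: localization,
# perception, motion planning, and control. All bodies are placeholders.

def localize(sensor_data):
    """Localization: estimate the robot's pose (x, y, heading) in the world."""
    return sensor_data.get("pose", (0.0, 0.0, 0.0))

def perceive(sensor_data):
    """Perception: detect nearby pedestrians as (position, velocity) pairs."""
    return sensor_data.get("pedestrians", [])

def plan(pose, pedestrians, goal):
    """Motion planning: pick the next velocity command toward the goal."""
    dx, dy = goal[0] - pose[0], goal[1] - pose[1]
    dist = max((dx ** 2 + dy ** 2) ** 0.5, 1e-9)
    speed = 1.2  # typical pedestrian walking speed, m/s
    return (speed * dx / dist, speed * dy / dist)

def control(velocity_cmd):
    """Control: hand the commanded velocity to the motor drivers."""
    return velocity_cmd  # placeholder: a real robot would actuate its wheels

def step(sensor_data, goal):
    """One pass through the pipeline: sense, plan, act."""
    pose = localize(sensor_data)
    pedestrians = perceive(sensor_data)
    cmd = plan(pose, pedestrians, goal)
    return control(cmd)
```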
Chen and his colleagues used standard approaches to solve the problems of localization and perception. For perception, they outfitted the robot with off-the-shelf sensors, including webcams, a depth sensor, and a high-resolution lidar. For localization, they used open-source algorithms to map the robot’s environment and determine its position. To control the robot, they employed standard methods for driving autonomous ground vehicles.
“The part of the field that we thought we needed to innovate on was motion planning,” Everett says. “Once you figure out where you are in the world, and know how to follow trajectories, which trajectories should you be following?”
That’s a tricky problem, particularly in pedestrian-heavy environments, where individual paths are often difficult to predict. As a solution, roboticists sometimes take a trajectory-based approach, in which they program a robot to compute an optimal path that accounts for everyone’s desired trajectories. These trajectories must be inferred from sensor data, because people don’t explicitly tell the robot where they are trying to go.
“But this takes forever to compute. Your robot is just going to be parked, figuring out what to do next, and meanwhile the person’s already moved way past it before it decides ‘I should probably go to the right,’” Everett says. “So that approach is not very realistic, especially if you want to drive faster.”
Others have used faster, “reactive-based” approaches, in which a robot is programmed with a simple model, using geometry or physics, to quickly compute a path that avoids collisions.
The problem with reactive-based approaches, Everett says, is the unpredictability of human nature: people rarely stick to a straight, geometric path, but instead weave and wander, veering off to greet a friend or grab a coffee. In such an unpredictable environment, reactive robots tend either to collide with people or, by dodging them too cautiously, to look as though they are being pushed around the hallway.
“The knock on robots in real situations is that they might be too cautious or aggressive,” Everett says. “People find that they don’t fit into the socially accepted rules, like giving people enough space or driving at acceptable speeds, and they get more in the way than they help.”
The team found a way around such limitations, enabling the robot to adapt to unpredictable pedestrian behavior while continuously moving with the flow and following typical social codes of pedestrian conduct.
They used reinforcement learning, a machine learning technique in which computer simulations train the robot to take certain paths given the speed and trajectory of other objects in the environment. The team also incorporated social norms into this offline training phase, encouraging the robot in simulation to pass on the right and penalizing it when it passed on the left.
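The social-norm shaping described above can be illustrated with a toy per-step reward function. The thresholds and penalty values here are invented for illustration only; the actual reward terms in the paper differ:

```python
# A hedged sketch of norm-aware reward shaping: collisions are punished
# hardest, crowding a pedestrian slightly, and passing on the left gets
# a small penalty so the learned policy prefers passing on the right.

def social_reward(reached_goal, min_ped_dist, passing_side):
    """Illustrative reward for one simulation step.

    reached_goal : bool, robot arrived at its destination this step
    min_ped_dist : float, meters to the closest pedestrian
    passing_side : "left", "right", or None for the current overtake
    """
    if min_ped_dist < 0.2:              # collision zone: large penalty
        return -0.25
    r = 1.0 if reached_goal else 0.0    # reward reaching the goal
    if min_ped_dist < 0.5:              # uncomfortably close: small penalty
        r -= 0.1 * (0.5 - min_ped_dist)
    if passing_side == "left":          # social-norm violation: discourage
        r -= 0.05
    return r
```

Because the penalty for passing on the left is much smaller than the collision penalty, the trained policy still breaks the norm when that is the only way to avoid hitting someone.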
“We want it to be traveling naturally among people and not be intrusive,” Everett says. “We want it to be following the same rules as everyone else.”
The advantage to reinforcement learning is that the researchers can perform these training scenarios, which take extensive time and computing power, offline. Once the robot is trained in simulation, the researchers can program it to carry out the optimal paths, identified in the simulations, when the robot recognizes a similar scenario in the real world.
The researchers enabled the robot to reassess its environment and adjust its path every one-tenth of a second. In this way, the robot can keep rolling through a hallway at a typical walking speed of 1.2 meters per second without pausing to reprogram its route.
“We’re not planning an entire path to the goal — it doesn’t make sense to do that anymore, especially if you’re assuming the world is changing,” Everett says. “We just look at what we see, choose a velocity, do that for a tenth of a second, then look at the world again, choose another velocity, and go again. This way, we think our robot looks more natural, and is anticipating what people are doing.”
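This receding-horizon style of planning, picking one velocity, executing it for a tenth of a second, and then looking again, can be sketched as follows. The straight-to-goal policy here is a stand-in for the learned, pedestrian-aware policy, and all names are illustrative:

```python
import math

def choose_velocity(pose, goal, pedestrians, speed=1.2):
    """Pick one velocity to hold for the next 0.1 s. This stand-in heads
    straight at the goal; the learned policy would also dodge pedestrians."""
    dx, dy = goal[0] - pose[0], goal[1] - pose[1]
    d = math.hypot(dx, dy)
    if d < 1e-6:                    # already at the goal
        return (0.0, 0.0)
    s = min(speed, d / 0.1)         # don't overshoot the goal in one step
    return (s * dx / d, s * dy / d)

def run(pose, goal, sense, dt=0.1, max_steps=1000):
    """Replan every dt seconds instead of committing to one full path."""
    for _ in range(max_steps):
        vx, vy = choose_velocity(pose, goal, sense())
        if (vx, vy) == (0.0, 0.0):
            break
        pose = (pose[0] + vx * dt, pose[1] + vy * dt)
    return pose
```

Because each velocity is held for only a tenth of a second, a pedestrian who suddenly changes course is simply absorbed into the next replanning cycle rather than invalidating a long precomputed path.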
Everett and his colleagues test-drove the robot in the busy, winding halls of MIT’s Stata Center, where it was able to drive autonomously for 20 minutes at a time. It rolled smoothly with the pedestrian flow, generally keeping to the right of hallways, occasionally passing people on the left, and avoiding any collisions.
“We wanted to bring it somewhere where people were doing their everyday things, going to class, getting food, and we showed we were pretty robust to all that,” Everett says. “One time there was even a tour group, and it perfectly avoided them.”
Going forward, Everett says, he plans to explore how robots might handle crowds in a pedestrian environment.
“Crowds have a different dynamic than individual people, and you may have to learn something totally different if you see five people walking together,” Everett says. “There may be a social rule of, ‘Don’t move through people, don’t split people up, treat them as one mass.’ That’s something we’re looking at in the future.”