A new spin on virtual reality helps engineers read robots’ minds
In a darkened, hangar-like space inside MIT’s Building 41, a small, Roomba-like robot is trying to make up its mind.
Standing in its path is an obstacle — a human pedestrian who’s pacing back and forth. To get to the other side of the room, the robot has to first determine where the pedestrian is, then choose the optimal route to avoid a close encounter.
As the robot considers its options, its “thoughts” are projected on the ground: A large pink dot appears to follow the pedestrian — a symbol of the robot’s perception of the pedestrian’s position in space. Lines, each representing a possible route for the robot to take, radiate across the room in meandering patterns and colors, with a green line signifying the optimal route. The lines and dots shift and adjust as the pedestrian and the robot move.
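The route-selection logic being visualized can be sketched as a simple cost comparison among candidate paths. This is only an illustrative sketch, not the lab's actual planner: the cost function, weights, and clearance threshold below are assumptions made for the example.

```python
import math

def route_cost(route, pedestrian, safety_weight=5.0, min_clearance=1.0):
    """Cost = path length + a penalty for passing too close to the pedestrian.

    route: list of (x, y) waypoints; pedestrian: estimated (x, y) position
    (the 'pink dot' in the projection). Weights here are illustrative.
    """
    length = sum(math.dist(a, b) for a, b in zip(route, route[1:]))
    clearance = min(math.dist(p, pedestrian) for p in route)
    penalty = safety_weight * max(0.0, min_clearance - clearance)
    return length + penalty

def best_route(routes, pedestrian):
    """Return the lowest-cost candidate (the 'green line' in the projection)."""
    return min(routes, key=lambda r: route_cost(r, pedestrian))

# Two candidate routes across a 10 m room; pedestrian estimated at (5, 0).
direct = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]   # shortest, but passes through the pedestrian
detour = [(0.0, 0.0), (5.0, 3.0), (10.0, 0.0)]   # longer, but keeps clearance
print(best_route([direct, detour], (5.0, 0.0)))  # the detour wins despite its extra length
```

Re-running this selection as the pedestrian's estimated position updates is what makes the projected lines "shift and adjust" in real time.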
This new visualization system combines ceiling-mounted projectors with motion-capture technology and animation software to project a robot’s intentions in real time. The researchers have dubbed the system “measurable virtual reality” (MVR) — a spin on conventional virtual reality that’s designed to visualize a robot’s “perceptions and understanding of the world,” says Ali-akbar Agha-mohammadi, a postdoc in MIT’s Aerospace Controls Lab.
“Normally, a robot may make some decision, but you can’t quite tell what’s going on in its mind — why it’s choosing a particular path,” Agha-mohammadi says. “But if you can see the robot’s plan projected on the ground, you can connect what it perceives with what it does to make sense of its actions.”
Agha-mohammadi says the system may help speed up the development of self-driving cars, package-delivering drones, and other autonomous, route-planning vehicles.
“As designers, when we can compare the robot’s perceptions with how it acts, we can find bugs in our code much faster,” Agha-mohammadi says. “For example, if we fly a quadrotor, and see something go wrong in its mind, we can terminate the code before it hits the wall, or breaks.”
The system was developed by Shayegan Omidshafiei, a graduate student, and Agha-mohammadi. They and their colleagues, including Jonathan How, a professor of aeronautics and astronautics, will present details of the visualization system at the American Institute of Aeronautics and Astronautics’ SciTech conference in January.
Seeing into the mind of a robot
The researchers initially conceived of the visualization system in response to feedback from visitors to their lab. During demonstrations of robotic missions, it was often difficult for people to understand why robots chose certain actions.