THE R2-D2 ROBOT FROM STAR WARS DOESN’T COMMUNICATE IN HUMAN LANGUAGE BUT IS, NEVERTHELESS, CAPABLE OF SHOWING ITS INTENTIONS. FOR HUMAN-ROBOT INTERACTION, THE ROBOT DOES NOT HAVE TO BE A TRUE ‘HUMANOID’, PROVIDED THAT ITS SIGNALS ARE DESIGNED IN THE RIGHT WAY, SAYS UT RESEARCHER DAPHNE KARREMAN.
A human being will only be capable of communicating with a robot if that robot has many human characteristics. That, at least, is the common idea. But mimicking natural movements and expressions is complicated, and some of our nonverbal communication is not really suitable for robots: wide arm gestures, for example. Humans prove capable of responding in a social way even to machines that look like machines. We have a natural tendency to translate machine movements and signals into the human world. Two simple lenses on a machine can be enough to make people wave at it.
Given that, the challenge is to design intuitive signals. In her research, Daphne Karreman focused on a robot functioning as a guide in a museum or a zoo. If the robot doesn’t have arms, can it still point to something the visitors have to look at? Using speech, written language, a screen, projection of images on a wall and specific movements, the robot has quite a number of ‘modalities’ that humans don’t have. Add to this the play of light and colour, and even a ‘low-anthropomorphic’ robot can be equipped with strong communication skills. That goes well beyond R2-D2, which communicates in beeps that first need to be translated. Karreman’s PhD thesis is therefore entitled ‘Beyond R2-D2’.
IN THE WILD
Karreman analysed a huge amount of video data to see how humans respond to a robot. Until now, this type of research had mainly been done in controlled lab situations, without other people present or after the test person had been told what was going to happen. In this case, the robot was introduced ‘in the wild’ and in an unstructured way. People could come across the robot in the Real Alcázar Palace in Seville, for example, and decide for themselves whether they wanted to be guided by it. What makes them keep their distance? Do people recognize what the robot is capable of?
To analyse these video data, Karreman developed a tool called the Data Reduction Event Analysis Method (DREAM). The robot, called Fun Robotic Outdoor Guide (FROG), has a screen, communicates using spoken language and light signals, and has a small pointer on its ‘head’. All by itself, FROG recognizes whether people are interested in interaction and guidance. Thanks to the powerful DREAM tool, it is possible for the first time to analyse and classify human-robot interaction in a fast and reliable way. Unlike other methods, DREAM does not interpret all signals immediately; instead, it compares the annotations of several human ‘coders’ to obtain a reliable and reproducible result.
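The article does not describe DREAM’s internals, but comparing several coders’ annotations is commonly quantified with a chance-corrected agreement score such as Cohen’s kappa. The sketch below illustrates that general idea; the labels and function names are invented for illustration and are not taken from Karreman’s actual tool.

```python
# Hypothetical sketch of an inter-coder agreement check, in the spirit of
# DREAM's comparison of multiple coders. Cohen's kappa corrects raw
# agreement for the agreement expected by chance.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' event labels."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of segments labelled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement: product of each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(freq_a) | set(freq_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Two coders annotating the same video segments of visitor behaviour
# (labels are hypothetical examples, not DREAM's coding scheme).
a = ["approach", "ignore", "follow", "approach", "ignore"]
b = ["approach", "ignore", "follow", "ignore", "ignore"]
print(round(cohens_kappa(a, b), 2))  # prints 0.69
```

A kappa near 1 means the coders agree far beyond chance, so the classification of the interaction can be treated as reproducible rather than one observer’s interpretation.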
How many people show interest? Do they join the robot for the entire tour, and do they respond as expected? This could be evaluated with questionnaires, but that would place the robot in a special position: people primarily come to visit the exhibition or zoo, not to meet a robot. With the DREAM tool, spontaneous interaction becomes visible, and robot behaviour can be optimized accordingly.
Learn more: ROBOT DOESN’T HAVE TO BE HUMAN LOOK-ALIKE