People are social creatures. Robots … not so much.
When we think of robots, we think of cold, metallic computers without emotion. If science fiction has taught us anything, though, it’s that we crave emotion, even in our robots – think C-3PO or Star Trek’s Data. So it stands to reason that if robots are ever going to become a fixture in our society, even becoming integrated into our households, we need to be able to read their faces. But how good are we at reading robot faces?
Scientists at Georgia Tech decided to test our ability to interpret a robot’s “emotion” by reading its expression, and to see whether that ability differs with age. They found that older adults read a robot’s face in some unexpectedly different ways than younger adults did. The research is being presented this week at the Human Factors and Ergonomics Society Annual Meeting in San Antonio.
“Home-based assistive robots have the potential to help older adults age in place. They have the potential to keep older adults independent longer, reduce healthcare needs and provide everyday assistance,” said Jenay Beer, a graduate student in Georgia Tech’s School of Psychology.