It’s often been said that the eyes are the window to the soul, revealing what we think and how we feel. Now, new research suggests that your eyes may also indicate your personality type, simply by the way they move.
Conducted by the University of South Australia in partnership with the University of Stuttgart, Flinders University and the Max Planck Institute for Informatics in Germany, the research uses state-of-the-art machine-learning algorithms to demonstrate a link between personality and eye movements.
Findings show that people’s eye movements reveal whether they are sociable, conscientious or curious, with the software reliably recognising four of the Big Five personality traits: neuroticism, extraversion, agreeableness, and conscientiousness.
Researchers tracked the eye movements of 42 participants as they undertook everyday tasks around a university campus, and subsequently assessed their personality traits using well-established questionnaires.
UniSA’s Dr Tobias Loetscher says the study provides new links between previously under-investigated eye movements and personality traits and delivers important insights for emerging fields of social signal processing and social robotics.
“There’s certainly the potential for these findings to improve human-machine interactions,” Dr Loetscher says.
“People are always looking for improved, personalised services. However, today’s robots and computers are not socially aware, so they cannot adapt to non-verbal cues.
“This research provides opportunities to develop robots and computers so that they can become more natural, and better at interpreting human social signals.”
Dr Loetscher says the findings also provide an important bridge between tightly controlled laboratory studies and the study of natural eye movements in real-world environments.
“This research has tracked and measured the visual behaviour of people going about their everyday tasks, providing more natural responses than if they were in a lab.
“And thanks to our machine-learning approach, we not only validate the role of personality in explaining eye movement in everyday life, but also reveal new eye movement characteristics as predictors of personality traits.”
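The approach described above can be sketched in outline: summarise each participant's gaze recording as a handful of eye-movement statistics, then train a classifier against questionnaire-derived trait labels under cross-validation. The feature set, the random-forest model, and the synthetic data below are illustrative assumptions for demonstration only, not the study's actual pipeline.

```python
# Illustrative sketch only: predicting a personality-trait label from
# simple eye-movement features. Feature names, the model choice, and
# the synthetic data are assumptions, not the study's actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic per-participant features (the study tracked 42 people):
# mean fixation duration (ms), saccade rate (per s), mean pupil
# diameter (mm), and blink rate (per s) -- hypothetical choices.
n = 42
features = np.column_stack([
    rng.normal(250, 40, n),   # mean fixation duration
    rng.normal(3.0, 0.5, n),  # saccade rate
    rng.normal(3.5, 0.4, n),  # mean pupil diameter
    rng.normal(0.3, 0.1, n),  # blink rate
])

# Binary label, e.g. above/below the median questionnaire score
# for one Big Five trait (random here, so accuracy will be ~chance).
labels = rng.integers(0, 2, n)

# Cross-validated classification, as is typical for small samples.
model = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(model, features, labels, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With real recordings and genuine questionnaire labels, above-chance cross-validated accuracy is what would support a link between eye movements and a trait.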