Simple system can recognize sixty percent of human touches
A squeeze of the arm, a pat on the shoulder, or a slap in the face – touch is an important part of social interaction between people. Social touch, however, is a relatively unexplored field when it comes to robots, even though robots operate with increasing frequency in society at large rather than just in the controlled environment of a factory.
Merel Jung is conducting research at the University of Twente's CTIT research institute into social touch interaction with robots. Using a relatively simple system – a mannequin's arm with pressure sensors, connected to a computer – she has succeeded in getting it to recognize sixty percent of all touches. The research has been published in the Journal on Multimodal User Interfaces.
Robots are becoming more and more social. A well-known example of a social robot is Paro, a robot seal used in care homes, where it has a calming effect on elderly residents and stimulates their senses. Positive results have been achieved with the robot for this target group, but we still have a long way to go before robots can correctly recognize, interpret, and respond to different types of social touch in the way that people can. It is a relatively unexplored area of science, but one in which much could be achieved in the long term. Examples that come to mind are robots that help children with autism improve their social skills, or robots that prepare medical students for real-life situations.
Merel Jung is therefore carrying out research at the University of Twente into social touch interaction between humans and robots. For a robot to respond correctly to being touched, she has identified four stages: the robot must perceive the touch, recognize it, interpret it, and then respond in the appropriate way. In this phase of her research, Jung focused on the first two stages – perceiving and recognizing.

With a relatively simple experiment, involving a mannequin's arm fitted with 64 pressure sensors, she succeeded in correctly distinguishing sixty percent of almost 8,000 touches, distributed over fourteen different types of touch at three levels of intensity. Sixty percent may not seem high at first glance, but it is a good result considering that there was no social context at all and that several touches closely resemble one another – for example, grabbing versus squeezing, or stroking roughly versus rubbing gently. In addition, the people touching the mannequin's arm were given no instructions on how to 'perform' their touches, and the computer system was not able to learn the style of the individual 'touchers'. Under similar conditions, people would not be able to correctly recognize every single touch either.

In her follow-up research, which is currently underway, Jung is concentrating on how robots can interpret touch in a social context. The expectation is that, by interpreting context, robots will be better able to respond to touch correctly, bringing the touch robot one step closer to reality.
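To make the "perceive and recognize" stages more concrete, here is a minimal, purely illustrative sketch of how touch gestures might be classified from a 64-sensor pressure grid. The 8x8 sensor layout, the hand-crafted features, the random-forest classifier, and the synthetic stand-in data are all assumptions for illustration; they are not taken from Jung's published system, whose actual sensors, features, and recognition method may differ.

```python
# Illustrative sketch only: NOT the published system. Assumes an 8x8 grid of
# pressure sensors (64 total, as mentioned in the article), a sequence of
# pressure frames per touch, and a generic off-the-shelf classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

RNG = np.random.default_rng(0)

def extract_features(frames: np.ndarray) -> np.ndarray:
    """Summarize a (time, 8, 8) pressure recording as a fixed-length vector."""
    total = frames.sum(axis=(1, 2))            # total pressure per frame
    return np.array([
        total.mean(),                          # average intensity
        total.max(),                           # peak overall intensity
        frames.max(),                          # strongest single-sensor reading
        (frames > 0.1).mean(),                 # rough contact area / duty cycle
        np.abs(np.diff(total)).mean(),         # temporal variation (pat vs. hold)
        frames.shape[0],                       # duration in frames
    ])

def make_sample(label: int) -> np.ndarray:
    """Synthetic stand-in for a real touch recording (hypothetical data)."""
    t = RNG.integers(10, 40)                   # variable touch duration
    return RNG.random((t, 8, 8)) * (0.2 + 0.05 * label)

# Fourteen touch classes, mirroring the fourteen touch types in the study.
labels = np.repeat(np.arange(14), 100)
X = np.stack([extract_features(make_sample(y)) for y in labels])

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.3, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"classification accuracy: {clf.score(X_test, y_test):.2f}")
```

On real recordings, a pipeline like this would report per-class accuracy as well, since the article notes that confusions concentrate among similar gestures such as grabbing and squeezing.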
Learn more: First Steps Towards The Touch Robot