Simple system can recognize sixty percent of human touches
A squeeze in the arm, a pat on the shoulder, or a slap in the face – touch is an important part of the social interaction between people. Social touch, however, is a relatively unknown field when it comes to robots, even though robots operate with increasing frequency in society at large, rather than just in the controlled environment of a factory.
Merel Jung is conducting research at the University of Twente CTIT research institute into social touch interaction with robots. Using a relatively simple system – a mannequin’s arm fitted with pressure sensors and connected to a computer – she has succeeded in getting it to recognize sixty percent of all touches. The research is published in the Journal on Multimodal User Interfaces.
Robots are becoming more and more social. A well-known example of a social robot is Paro, a robot seal used in care homes, where it has a calming effect on elderly residents and stimulates their senses. Positive results have been achieved with the robot for this target group, but we still have a long way to go before robots can correctly recognize, interpret, and respond to different types of social touch in the way that people can. It is a relatively little-explored area of science, but one in which much could be achieved in the long term. Examples that come to mind are robots that assist children with autism in improving their social contacts, or robots that train medical students for real-life situations.
Merel Jung is therefore carrying out research at the University of Twente into social touch interaction between humans and robots. For a robot to respond correctly to being touched, she has identified four stages: the robot must perceive the touch, recognize it, interpret it, and then respond in the appropriate way. In this phase of her research, Jung focused on the first two stages: perceiving and recognizing.

With a relatively simple experiment, involving a mannequin’s arm fitted with 64 pressure sensors, she succeeded in distinguishing sixty percent of almost 8,000 touches (distributed over fourteen different types of touch at three levels of intensity). Sixty percent may not seem very high at first glance, but it is a good figure if you bear in mind that there was no social context whatsoever and that various touches closely resemble each other: the difference between grabbing and squeezing, for example, or between stroking roughly and rubbing gently. In addition, the people touching the mannequin’s arm had been given no instructions on how to ‘perform’ their touches, and the computer system was not able to ‘learn’ how the individual ‘touchers’ operated. Under similar circumstances, people would not be able to correctly recognize every touch either.

In her follow-up research, which Jung is currently undertaking, she is concentrating on how robots can interpret touch in a social context. The expectation is that, by interpreting the context, robots will be better able to respond to touch correctly, bringing the touch robot one step closer to reality.
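The recognition stage described above can be illustrated with a minimal sketch: a touch recorded as a sequence of 64-sensor pressure frames is summarized by a few hand-crafted features (duration, peak pressure, average contact area) and assigned to the nearest class centroid. The three touch types, the sensor layout, and all parameter values below are invented for illustration; this is not the classifier or the data used in the study.

```python
import random
import statistics

random.seed(0)

N_SENSORS = 64  # the mannequin arm in the study carried 64 pressure sensors


def synth_touch(kind):
    """Generate a synthetic touch: a list of pressure frames.

    Frame counts, contact areas, and peak pressures are made-up
    illustrative values, not measurements from the study."""
    if kind == "pat":
        frames, area, peak = 5, 8, 0.9    # short, firm, small area
    elif kind == "squeeze":
        frames, area, peak = 20, 24, 0.8  # sustained, large area
    else:  # "stroke"
        frames, area, peak = 30, 6, 0.3   # long, light, small area
    seq = []
    for _ in range(frames):
        frame = [0.0] * N_SENSORS
        start = random.randrange(N_SENSORS - area)
        for i in range(start, start + area):
            frame[i] = peak * random.uniform(0.7, 1.0)
        seq.append(frame)
    return seq


def features(seq):
    """Summarize a touch as (duration, peak pressure, mean contact area)."""
    peak = max(max(frame) for frame in seq)
    area = statistics.mean(sum(1 for v in frame if v > 0.05) for frame in seq)
    return (len(seq), peak, area)


# Build one centroid per class from a handful of synthetic training touches.
KINDS = ["pat", "squeeze", "stroke"]
centroids = {
    kind: tuple(statistics.mean(col)
                for col in zip(*(features(synth_touch(kind)) for _ in range(20))))
    for kind in KINDS
}


def classify(seq):
    """Assign a touch to the class with the nearest feature centroid."""
    f = features(seq)

    def dist(c):
        # Scale each feature by its centroid magnitude so that duration
        # (tens of frames) does not drown out pressure (fractions of 1).
        return sum(((a - b) / (abs(b) + 1e-9)) ** 2 for a, b in zip(f, c))

    return min(KINDS, key=lambda k: dist(centroids[k]))
```

On these cleanly separated synthetic classes the sketch classifies almost perfectly; the sixty percent reported in the study reflects the far harder setting of fourteen real touch types, three intensities, and untrained, uninstructed touchers.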
Learn more: First Steps Towards The Touch Robot