A soft stroke of the upper arm. A nudge under the table. A punch in the face. There are many ways that humans communicate complex emotions with a simple touch, but robots have always been largely oblivious to these social cues.

Now, however, engineers at the University of Twente in the Netherlands are developing a system designed to recognise different types of social touch and figure out what they might mean. They hope that one day it could help robots perform better in social situations.

Using a prototype that combines a mannequin's arm with 64 pressure sensors, Merel Jung and her team have identified four stages the robot must pass through: it must perceive a touch, recognise it, interpret it, and then respond appropriately.
So far, most of the work has focused on the first two of those stages – perceiving and recognising. In testing, the disembodied arm correctly recognised 60 percent of almost 8000 different touches, spread across 14 types of touch delivered at three levels of intensity…
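The recognition stage is, at heart, a classification problem: mapping a frame of readings from the 64 pressure sensors to one of several touch categories. The article doesn't describe the team's actual features or classifier, so the following is only a toy sketch under invented assumptions – three made-up touch patterns ("poke", "pat", "stroke") on a synthetic 8×8 sensor grid, sorted by a simple nearest-centroid rule:

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_touch(kind):
    """Return a synthetic 8x8 pressure frame for a hypothetical touch type.

    These patterns are illustrative inventions, not data from the study.
    """
    frame = rng.normal(0.0, 0.05, (8, 8)).clip(min=0)  # baseline sensor noise
    if kind == "poke":       # sharp, localised pressure at one sensor
        frame[3, 3] += 1.0
    elif kind == "pat":      # broad, moderate contact over a patch
        frame[2:6, 2:6] += 0.4
    elif kind == "stroke":   # elongated band of light contact
        frame[3, :] += 0.25
    return frame.ravel()     # flatten to a 64-dimensional feature vector

kinds = ["poke", "pat", "stroke"]

# "Training": average 20 noisy examples of each touch into a centroid.
centroids = {k: np.mean([synth_touch(k) for _ in range(20)], axis=0)
             for k in kinds}

def classify(frame):
    """Guess the touch type by distance to the nearest centroid."""
    return min(kinds, key=lambda k: np.linalg.norm(frame - centroids[k]))
```

With such cleanly separated synthetic patterns the toy classifier is nearly perfect; the 60 percent figure reported for the real arm reflects how much harder the task is with 14 genuinely similar touch types at varying intensities.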