By Timothy Revell
15 March 2017
To make the robot and avatar, the researchers collected videos of people expressing pain, disgust and anger, and used face-tracking software to convert their expressions into a series of moving points. They then mapped these points onto the robot and avatar faces. The robot used was Hanson Robotics’ Philip K. Dick, a humanoid modelled on the science fiction writer, with realistic rubber skin and movable facial features.
To test how well people could perceive emotions from the simulated facial expressions, videos of the robot and avatar were shown to 102 volunteers, who had to judge which emotion matched which expression. Half the volunteers were clinicians, such as doctors, nurses and pharmacists, and half had no medical background.
The clinicians turned out to be less accurate than the non-clinicians at recognising both pain and anger. In the starkest difference, the clinicians correctly identified pain expressed by the virtual avatar only 54 per cent of the time, compared with 83 per cent for the non-clinicians.
This echoes previous research suggesting that doctors are worse than laypeople at interpreting pain in humans, and that they tend to underestimate its severity. One possible explanation is that medical training decreases empathy.