Researchers looked at algorithms designed to read a person’s emotions and found many were ‘almost always wrong’.
Facial expressions have generally been thought to reliably reflect a person’s innermost emotions, but new research indicates otherwise. Based on preliminary findings presented at a science conference in the US, researchers have gone as far as to say “it might be more accurate to say we should never trust a person’s face”.
Aleix Martinez, a professor of electrical and computer engineering at The Ohio State University in the US, said: “The question we really asked is: ‘Can we truly detect emotion from facial articulations?’
“And the basic conclusion is, no, you can’t.”
‘Everyone makes different facial expressions based on context and cultural background’
– ALEIX MARTINEZ
He also described the technology many companies use to recognise facial muscle movements and assign emotion or intent to those movements as “complete baloney”. Martinez and his team analysed four million facial expressions from 35 different countries. They found that attempts to detect emotion based on facial expressions “were almost always wrong”.
Martinez said: “Everyone makes different facial expressions based on context and cultural background. And it’s important to realise that not everyone who smiles is happy. Not everyone who is happy smiles.
“I would even go to the extreme of saying most people who do not smile are not necessarily unhappy. And if you are happy for a whole day, you don’t go walking down the street with a smile on your face. You’re just happy.”
The researchers also looked at algorithms some companies use to determine customer satisfaction and other human emotions through facial expressions.
Martinez said: “Some claim they can detect whether someone is guilty of a crime or not, or whether a student is paying attention in class, or whether a customer is satisfied after a purchase. What our research showed is that those claims are complete baloney.
“There’s no way you can determine those things. And worse, it can be dangerous.”
According to Martinez, the danger lies in the possibility of missing a person’s real emotion or intent and then making decisions based on that mistaken assumption.
After analysing the data they gathered on facial expressions and emotion, the research team concluded that it takes more than facial expressions to correctly detect emotion: facial colour, body posture and, most importantly, context all need to be taken into consideration.
The findings were presented at the annual meeting of the American Association for the Advancement of Science in Seattle.
– PA Media