New research shows Artificial Intelligence still lags behind humans when it comes to recognising emotions – Irish Tech News

New DCU-led research into the accuracy of artificial intelligence at reading emotions on our faces has shown that it still lags behind human observers when it comes to telling whether we're happy or sad. The difference was particularly pronounced for spontaneous displays of emotion.

The recently published study, "A performance comparison of eight commercially available automatic classifiers for facial affect recognition", looked at eight out-of-the-box automatic classifiers for facial affect recognition (artificial intelligence that can identify human emotions from faces) and compared their emotion recognition performance to that of human observers.

It found that human recognition accuracy for emotions was 72%, whereas the artificial intelligence systems tested showed a large variance in recognition accuracy, ranging from 48% to 62%.

The work was conducted by Dr Damien Dupré from Dublin City University's Business School, Dr Eva Krumhuber from the Department of Experimental Psychology at UCL, Dr Dennis Küster from the Cognitive Systems Lab, University of Bremen, and Dr Gary J. McKeown from the Department of Psychology at Queen's University Belfast.

Eight out-of-the-box automatic classifiers were tested.

937 videos were sampled from two large databases conveying the six basic emotions (happiness, sadness, anger, fear, surprise, and disgust).

The study examined both posed and spontaneous emotions.

Results revealed a significant recognition advantage for human observers (72% accuracy) over automatic classification.

Among the eight classifiers, there was considerable variance in recognition accuracy ranging from 48% to 62%.

Classification accuracy for AI was consistently lower for spontaneous affective behaviour.

The findings indicate shortcomings of existing out-of-the-box classifiers for measuring emotions.

Two well-known dynamic facial expression databases were chosen: BU-4DFE from Binghamton University in New York, and another from The University of Texas at Dallas.

Both are annotated in terms of emotion categories and contain either posed or spontaneous facial expressions. All of the examined expressions were dynamic to reflect the realistic nature of human facial behaviour.

To evaluate the accuracy of emotion recognition, the study compared the performance achieved by human judges with that of eight commercially available automatic classifiers.
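As a rough illustration of the comparison (this is not code or data from the study, and the label lists are hypothetical), recognition accuracy for each judge is simply the share of videos whose predicted emotion matches the database annotation:

```python
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

def recognition_accuracy(predicted, actual):
    """Fraction of videos whose predicted emotion matches the annotated label."""
    assert len(predicted) == len(actual) and actual, "label lists must align"
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Hypothetical annotations and outputs for five clips, for illustration only.
annotations       = ["happiness", "anger", "fear", "sadness", "surprise"]
classifier_output = ["happiness", "anger", "surprise", "sadness", "surprise"]
human_output      = ["happiness", "anger", "fear", "sadness", "surprise"]

print(recognition_accuracy(classifier_output, annotations))  # 0.8
print(recognition_accuracy(human_output, annotations))       # 1.0
```

Computing this per classifier across the 937 sampled videos is what yields accuracy figures like the 48%–62% range reported for the automatic systems versus 72% for human observers.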

Dr Damien Dupré said:

AI systems claiming to recognise humans' emotions from their facial expressions are now very easy to develop. However, most of them are based on inconclusive scientific evidence that people express emotions in the same way.

For these systems, human emotions come down to only six basic emotions, but they do not cope well with blended emotions.

Companies using such systems need to be aware that the results obtained are not a measure of the emotion felt, but merely a measure of how much one's face matches a face supposed to correspond to one of these six emotions.

Co-author Dr Eva Krumhuber from UCL added:

AI has come a long way in identifying people's facial expressions, but our research suggests that there is still room for improvement in recognising genuine human emotions.

Dr Krumhuber recently led a separate study published in Emotion (also involving Dr Küster) comparing human vs. machine recognition across fourteen different databases of dynamic facial expressions.

Researchers

Dr Damien Dupré, Business School, Dublin City University

Dr Eva Krumhuber, Department of Experimental Psychology, UCL

Dr Dennis Küster, Cognitive Systems Lab, University of Bremen

Dr Gary J. McKeown, Department of Psychology, Queen's University Belfast

