Artificial Intelligence: Why it Can’t Detect the Correlation Between Human Emotion and Facial Expression

Understanding Artificial Intelligence

(Photo: Andrea Piacquadio from Pexels)

Artificial intelligence, or AI, is a machine's attempt to simulate human intelligence. Often the term refers to just one component of AI: machine learning. This technology requires a foundation of specialized software and hardware for writing and training the machine's complex algorithms. Although no single programming language is required for AI, Python, R, and Java are popular options, explains BuiltIn.

AI systems, in general, work by ingesting vast amounts of labeled training data, analyzing the information for patterns and correlations, and using those patterns to make predictions. Chatbots, for example, are fed a myriad of text chat examples to learn to produce lifelike conversations with people. Likewise, image recognition tools learn to identify and describe objects after examining millions of examples. Simply put, AI focuses on three cognitive skills: learning, reasoning, and self-correction.
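To make that loop concrete, here is a minimal sketch of supervised learning in Python using the scikit-learn library; the tiny labeled dataset and the "cat"/"dog" labels are invented purely for illustration.

```python
from sklearn.linear_model import LogisticRegression

# Labeled training data: each row is a feature vector, each label the answer.
X_train = [
    [0.9, 0.8],  # features from a "cat" example (invented numbers)
    [0.8, 0.9],
    [0.1, 0.2],  # features from a "dog" example
    [0.2, 0.1],
]
y_train = ["cat", "cat", "dog", "dog"]

# "Learning": the model analyzes the data for patterns and correlations.
model = LogisticRegression()
model.fit(X_train, y_train)

# "Prediction": the learned patterns are applied to unseen inputs.
print(model.predict([[0.85, 0.75], [0.15, 0.25]]))  # -> ['cat' 'dog']
```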


A study published in the journal Nature Communications, titled "Professional Actors Demonstrate Variability, Not Stereotypical Expressions, When Portraying Emotional States in Photographs," analyzed photographs of actors to examine the complex relationship between human emotions and facial expressions. The team found that people can use similar expressions to portray different emotions and, likewise, that the same emotion can be expressed in various ways. The researchers also found that inference depended on context. Hence, judging people's inner states by analyzing facial expressions with a complex algorithm is flawed.

Researchers used 13 emotion categories to analyze facial expressions in over 60 photographs of actors who were given emotion-evoking scenarios to react to. The scenario descriptions, however, did not specify which emotion to feel. The categories were assigned through the judgments of about 800 volunteers, with the help of the Facial Action Coding System, which relates action units to movements of the facial muscles. The machine learning analysis revealed that the actors portrayed the same emotion categories by contorting their faces in various ways. Likewise, similar expressions did not always reveal the same emotion.
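The study's actual analysis pipeline is not reproduced here, but the core idea can be sketched roughly: represent each photograph as a vector of Facial Action Coding System action-unit intensities, then check whether photos labeled with the same emotion split into distinct expression clusters. The action-unit vectors below are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical FACS action-unit intensities (e.g., AU4 brow lowerer,
# AU5 upper lid raiser, AU23 lip tightener) for photos all labeled "anger".
anger_photos = np.array([
    [0.9, 0.8, 0.7],  # a scowling variant
    [0.8, 0.9, 0.8],
    [0.1, 0.2, 0.1],  # a tight-lipped, still-faced variant
    [0.2, 0.1, 0.2],
])

# If one emotion mapped to one stereotypical expression, these rows would
# form a single tight cluster; here a 2-cluster fit splits them cleanly,
# signaling variability within the same emotion category.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(anger_photos)
print(kmeans.labels_)  # e.g., [0 0 1 1]: two distinct "anger" expressions
```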

The study was run with two groups. In the first, more than 840 volunteers rated about 30 faces each across the 13 emotion categories; in the second, 845 people rated about 30 face-and-scenario pairs each. Results from the two groups mostly differed, leading the researchers to conclude that analyzing human facial expressions out of context yields misleading judgments. The context of a person's emotional expression, then, is of the utmost importance.
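As a hypothetical illustration of that comparison, the snippet below checks how often labels given to a face alone match labels given to the same face paired with its scenario; all labels are invented.

```python
# Hypothetical ratings: the label a face received on its own vs. the label
# the same face received when paired with its scenario. All labels invented.
face_only = ["anger", "fear", "joy", "anger", "sadness"]
face_with_scenario = ["determination", "fear", "triumph", "anger", "awe"]

matches = sum(a == b for a, b in zip(face_only, face_with_scenario))
print(f"Agreement: {matches / len(face_only):.0%}")  # -> Agreement: 40%
```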

Lisa Feldman Barrett, the lead author of the study and a psychology professor at Northeastern University, says the research directly counters the traditional approach AI takes to emotion, reports Gadgets360.

