[Image: a smiling emoticon next to a question mark]


Joy. Sadness. Surprise. Fear. Disgust. Anger. These are all universal emotions, according to the theories of psychologist Paul Ekman. He also posits that contempt may be a key emotion, although it is not as readily recognized as such by others.

Emotion Analysis of People’s Faces

In The Universally Recognized Facial Expressions of Emotion I analyzed the research on facial expressions of emotion, and found that the researchers and scientists were not in universal agreement. However, Paul Ekman's work from the 1960s still has a considerable support base. Even the naysayers admit that at least a few emotions produce universally clear reactions.

So does emotion show in people's faces? It appears that some emotions show clearly, some appear only as fleeting microexpressions, and others are internalized entirely and can be hidden relatively easily from those around you.

Types of Interaction

There is another area to consider as well. In reality there are two types of interaction to observe when measuring facial emotions: human-to-computer interaction (for example, watching movies online or surfing the internet) and human-to-human interaction. The latter can still occur through a computer, for example when Skype-calling or teleconferencing. Research has shown that when humans interact with computers their expressions are more subtle than in their human-to-human interactions, even when those human-to-human interactions take place over the internet.

Which Emotions Are Measured by the Emotion Analysis API?

Internal research by Kairos, and by IMRSV (the company that initially developed its Emotion Analysis API), found that not all of Ekman’s universal emotions produce consistently distinctive facial expressions. For example, most people struggle to differentiate between an angry person and a disgusted person based solely on the reaction showing on the person's face. The traditional view of an angry face is that the eyebrows lower, the lips press firmly together and the eyes bulge. The traditional view of a disgusted face is that the upper lip raises, the nose bridge wrinkles and the cheeks raise. The problem, in the real world, is that different people react in different ways to these two emotions, and the facial reactions overlap. This makes it virtually impossible to say definitively which of the two emotions the person in a picture is actually feeling.

You can, however, tell that the person dislikes something. Whether they display the traditional anger signs, the traditional disgust signs, or something in between, it is clear that they are feeling negative about the situation they are in. For that reason we use Negative as one of the emotional states in the Emotion Analysis API.
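To illustrate that reasoning, here is a minimal sketch in Python. It is an assumption-laden toy, not the API's actual internals: the score names and the rule of simply taking the stronger of the two overlapping expressions are mine, for illustration only.

```python
# Toy illustration: fold overlapping anger-like and disgust-like cues into a
# single "negative" value. The field names and the max() rule are assumptions
# for this sketch, not the published behaviour of the Emotion Analysis API.

def collapse_to_negative(expression_scores: dict) -> float:
    """Report the stronger of the two overlapping expressions as 'negative'.

    Because anger and disgust overlap so heavily in real faces, reporting a
    single combined value avoids pretending we can tell them apart.
    """
    return max(expression_scores.get("anger", 0.0),
               expression_scores.get("disgust", 0.0))


frame_scores = {"anger": 0.41, "disgust": 0.57, "joy": 0.02}
print(collapse_to_negative(frame_scores))  # 0.57 -> reported simply as Negative
```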

Similarly, we have found the traditional description of the facial muscle movements that depict sadness (the mouth corners lowering, the eyebrows descending towards the inner corners and the eyelids drooping) to be too generic and exaggerated. This expression is rarely seen in real-world experiments, so it would be an unreliable analytics measurement were we to include it.

Even the traditional view of a “fear-look” is too stereotyped to be of much use in facial emotion analysis. It is all very well to say that “the facial expression of fear has these distinctive features: raised eyebrows, tensed lower eyelids, eyebrows drawn together, lips stretched horizontally”, and in extreme panic it clearly does, but at milder levels of fear you do not see the same distinctive intensity.

Therefore, which emotions can you confidently analyse from a picture or video of a face?

As I described above, it is possible to determine a person’s negative feelings towards something. At the opposite extreme, it is easy to tell if a person is happy - or at least if they are smiling.

There appears to be a reasonably consistent reaction to surprise. It is signalled by the eyebrows arching, the eyes opening wide to expose more white, and the jaw dropping slightly. While higher degrees of surprise are likely to lead to more extreme facial reactions, the pattern tends to be the same.

A fourth state that can be determined is whether a person is being attentive or not. Obviously, if a person’s eyes are wandering all over the place, they are not being very attentive. If their eyeline is consistent, it is quite likely that they are remaining focused on what they are doing, which indicates a higher level of attentiveness.
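One way to picture that (purely as an illustrative sketch, not how the Emotion Analysis API itself measures attention) is to look at how much the estimated eyeline moves over a short window of video: a steady gaze scores high, a wandering one scores low. The gaze angles themselves are assumed to come from whatever face-tracking step you already have.

```python
# Illustrative sketch: score attentiveness from eyeline stability.
# gaze_angles is assumed to be a list of (yaw, pitch) gaze estimates in
# degrees, one per video frame, produced by some upstream gaze tracker.
from statistics import pstdev

def attentiveness(gaze_angles, max_spread_deg=5.0):
    """Return a rough 0..1 attentiveness estimate from gaze stability.

    A wandering eyeline (large spread in yaw or pitch) scores near 0;
    a steady eyeline scores near 1.
    """
    yaw_spread = pstdev(a[0] for a in gaze_angles)
    pitch_spread = pstdev(a[1] for a in gaze_angles)
    spread = max(yaw_spread, pitch_spread)
    return max(0.0, 1.0 - spread / max_spread_deg)


steady = [(0.5, -1.0), (0.8, -0.9), (0.4, -1.2), (0.6, -1.1)]
wandering = [(12.0, 3.0), (-8.0, -6.0), (15.0, 2.0), (-10.0, 7.0)]
print(attentiveness(steady))     # ~0.97 -> eyes steady, likely focused
print(attentiveness(wandering))  # 0.0  -> eyes wandering, not very attentive
```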

Of course, that begs the question: is being in an attentive state an emotion in itself, or a symptom of an emotion? That is a topic I am sure could keep the emotion researchers pondering.


Here is a demonstration of the Emotion Analysis API in operation, observing a driver in action.


Why don’t you have a go at using the Emotion Analysis API yourself?
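If you would rather start from code, the sketch below shows the general shape of posting an image to an emotion-analysis REST endpoint and reading back per-face scores. The URL, header names and response fields are placeholders I have assumed for illustration; check the current Kairos API documentation for the real endpoint, authentication and response format.

```python
# Minimal sketch of calling an emotion-analysis REST endpoint with one image.
# The URL, headers and JSON fields below are placeholders, not the documented
# Kairos Emotion Analysis API; consult the official docs before relying on them.
import requests

API_URL = "https://api.example.com/v2/media"   # placeholder endpoint
HEADERS = {
    "app_id": "YOUR_APP_ID",                   # placeholder credentials
    "app_key": "YOUR_APP_KEY",
}

def analyse_frame(image_path: str) -> dict:
    """Upload one image and return the service's JSON response."""
    with open(image_path, "rb") as f:
        response = requests.post(API_URL, headers=HEADERS,
                                 files={"source": f}, timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    result = analyse_frame("driver_frame.jpg")
    # Assumed response shape: one entry per detected face, with scores such
    # as "negative", "smile", "surprise" and "attention".
    for face in result.get("faces", []):
        print(face)
```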


I have shown a couple of demonstrations of emotion analysis in action. What other potential uses can you think of? I hope these examples have inspired you to have a go at creating an app that incorporates emotion analysis. How could you put it to good use in the real world? The next instalment in this series of emotion blogs is all about Emotion Analysis in the Real World.

