Analysing micro-expressions from a (failed) user testing session

In a recent professional experience, I was unexpectedly sent to Spain to run a user test of a system I had not designed, with the promise of a translator. Once there: no translator. Nice. But I did have the possibility of capturing video of the users’ faces and of their interactions with the tested system.

Watching the footage, I realised that Spanish people tend to be very expressive and to communicate a lot through gestures and facial expressions. Unable to understand what they said in words, I used the FACS method initiated by Paul Ekman (Facial Action Coding System) to analyse the users’ non-verbal communication and, more specifically, their micro-expressions. Thanks to the simultaneous capture of the users’ faces and their interactions with the tested system, I was able to relate these micro-expressions to specific interactions.

Seven representations of the primary emotions and their micro-expressions: anger, contempt, disgust, fear, happiness, sadness, surprise.
Universal emotions and micro-expressions according to Paul Ekman’s Facial Action Coding System. The Facial Action Coding System belongs to Paul Ekman. For more information, visit https://www.paulekman.com/

I realised that through the non-verbal expression of their emotions, users communicate not only their emotional states but also their probable behaviour towards the tested system in response to those states.

Pros and cons of analysing micro-expressions in a user testing session

Why analysing micro-expressions can be good for your UX Research study

  • The decoding of non-verbal communication cannot be done instinctively and must be made objective (e.g. laughter or tears). If you think you can decode your users’ emotional states on instinct, remember that even experts in Paul Ekman’s FACS method were unable to detect all of the micro-expressions expressed by a test subject. The main character of Lie to Me does not exist in real life; this is why decoding users’ micro-expressions and emotions takes time and rigour.
  • Users who know they are being watched will tend to keep certain comments to themselves and not express their emotions freely. A field study is always a social situation in itself.
  • The peculiarity of micro-expressions is that they are involuntary and uncontrollable: they betray the users’ true emotions.
  • Many facial expressions reflect universal emotions; the message is not transformed by culture or language.
  • Micro-expressions are precisely located in time. Because they are very brief (by definition), it is often easy to relate them to a specific interaction if the study setup allows it.
  • By communicating their emotional states, users also communicate their probable behaviour in response to these states (abandonment, rejection, etc.).
  • A micro-expression can be linked to an emotion but can also (admittedly less objectively) be interpreted on an intensity scale.
  • Observation alone is enough to gather the information. The method is therefore non-intrusive in itself.

The limits of the analysis of micro-expressions

  • Almost undetectable to the naked eye, micro-expressions require a video recording of the users’ faces and of their interactions with the system under test, plus careful viewing.
  • Analysing micro-expressions is like taking food supplements: it does not exempt you from applying the complete study methodology.
  • Like any empirical observation in UX Research, they do not constitute a conclusion in themselves but must be explained.
  • The human face often presents mixtures of facial expressions (surprise, for example), which complicates the decoding task.
  • Micro-expression analysis systems only consider expressions related to emotions. This presupposes an analysis focused on the emotional experience more than on the functional one, even if bridges can be built between the two.

Universal emotions and micro-expression decoding in a nutshell

Ekman’s universal emotions

How much of the emotional make-up of human beings is determined? To what extent are these emotions shaped by society and culture? This is the question Paul Ekman (like other scholars before him) attempted to answer. Through his work, conducted all over the world, Paul Ekman has shown that certain emotions are common to all of humanity.

According to his work, there are 7 of these universal emotions: Anger, Contempt, Disgust, Enjoyment, Fear, Sadness and Surprise. You can learn more about them on the dedicated page of Ekman’s official website, but I strongly recommend buying his latest book if you are really interested. To each of these emotions corresponds a physical expression, also universal, which is recognisable thanks to the Facial Action Coding System method by the same author.

“Emotions change the way we see the world and how we interpret the actions of others.” — Paul Ekman

Ekman’s Facial Action Coding System

In Paul Ekman’s Facial Action Coding System (FACS) method, each facial movement (called an AU, for Action Unit) corresponds to a code. For example: Brow lowerer = 4; Nose wrinkler = 9; Lip corner depressor = 15; Chin raiser = 17…

These AUs can be combined with one another. Thus, to each micro-expression corresponds a code expressed as a combination of several AUs. For example, Happiness is codified as 6+12, where 6 = Cheek raiser and 12 = Lip corner puller.

The peculiarity of this approach is that the subject whose micro-expressions we study will sometimes express a mixture of several expressions. If, for example, the person shows Cheek raiser (6), Lip corner puller (12) and Brow lowerer (4), their expression of Happiness is most likely blended with a second emotion. The code for this micro-expression will be 6+12+4 and should be interpreted differently.

To go further, Paul Ekman’s Facial Action Coding System also makes it possible to code the intensity of an observed Action Unit (AU). For this, a letter from A to E (A = Trace, E = Maximum) is appended to the number of the AU.
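The coding scheme described above can be sketched in a few lines of Python. This is only an illustrative toy, not FACS tooling: the AU names and the Happiness combination (6+12) come from this article, while the blend-detection logic and all function names are my own assumptions.

```python
import re

# AU codes named in the article (Ekman's FACS)
AU_NAMES = {
    4: "Brow lowerer",
    6: "Cheek raiser",
    9: "Nose wrinkler",
    12: "Lip corner puller",
    15: "Lip corner depressor",
    17: "Chin raiser",
}

# Intensity letters appended to an AU number, A = Trace ... E = Maximum
INTENSITY = {"A": "Trace", "B": "Slight", "C": "Marked", "D": "Severe", "E": "Maximum"}

# Only Happiness (6+12) is given in the article; treat it as the sole known combo.
EMOTIONS = {frozenset({6, 12}): "Happiness"}

def parse_facs(code: str):
    """Split a code like '6B+12A+4' into (AU number, intensity letter or None) pairs."""
    units = []
    for part in code.split("+"):
        m = re.fullmatch(r"(\d+)([A-E])?", part.strip())
        if not m:
            raise ValueError(f"Bad AU token: {part!r}")
        units.append((int(m.group(1)), m.group(2)))
    return units

def interpret(code: str) -> str:
    """Map a combination of AUs to an emotion, flagging possible blends."""
    aus = frozenset(au for au, _ in parse_facs(code))
    if aus in EMOTIONS:
        return EMOTIONS[aus]
    # A strict superset of a known combo suggests a blended expression
    for combo, name in EMOTIONS.items():
        if combo < aus:
            return f"{name} blended with extra AUs {sorted(aus - combo)}"
    return "Unknown combination"

print(interpret("6+12"))      # Happiness
print(interpret("6B+12B+4"))  # Happiness blended with extra AUs [4]
```

A real FACS coder works from the full table of AUs and emotion combinations; the point here is only to show how the `number + intensity letter` notation composes.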

There are many other subtleties in this method of coding micro-expressions that I will not detail in this article, because at my level it is not necessary to go that far. If this area is unknown to you and you wish to explore it, I encourage you to consult the page dedicated to this subject on Paul Ekman’s official website. You can also consult the Wikipedia page of the Facial Action Coding System (FACS) to find out how the coding of micro-expressions works.

Bringing out insights from a failed user testing by analysing micro-expressions

Always capture video whenever possible! As I replayed the test videos, I noticed that users had very brief physical reactions when interacting with the system under test. I paused the video and scrolled through the frames one by one. I realised that while the users kept a soft overall emotional expression throughout the test, for the duration of a frame or two they showed a very exaggerated facial expression. This exacerbation was often concomitant with a specific interaction: the discovery of a new page, the reaction (or lack of reaction) to a button click. These were micro-expressions.

Note: I used Open Broadcaster Software (OBS) to simultaneously record the screen and the users’ faces. In the software you can place and resize zones as you like, so that 2/3 of the video is dedicated to the screen capture (to capture interactions) and 1/3 to the user’s face. Kind reminder: you can crop zones on your OBS canvas by resizing them while pressing “option” on Mac.

On the left a screen capture, on the right the face of the user.
A screenshot of the OBS software

A micro-expression is the expression of an emotion through the muscles of the face. According to Paul Ekman’s work, because a micro-expression betrays the person’s real emotional state, without a social filter, it is always very brief and lasts less than half a second. After this half-second, the person regains control of their facial expressions to filter what they express; this is because social pressure often prevents us from freely expressing negative emotions such as anger or fear.

It is therefore very interesting to watch users betray their true emotional states for less than half a second. When this happens, it often reveals a lot.

Note that a user can also express an emotion without it being a micro-expression. If the expression lasts longer than half a second, it is no longer a micro-expression. It is nonetheless valid in itself, but we must keep in mind that it could be falsified by the person, consciously or not, out of modesty or social pressure. The peculiarity of micro-expressions is that they are never falsified.

Once a micro-expression has been captured, it is possible (if the study medium allows it) to link it to an interaction. By analysing the micro-expression, and thereby revealing the emotion it communicates and its intensity, we can understand the user’s reaction to this specific interaction. If the user expresses anger, it may be the result of frustration at being unable to complete a task. Even if the user does not mention their anger, this is an issue to consider.
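The linking step can be sketched as a simple timestamp match: pair each micro-expression with the closest preceding interaction within half a second (the article’s stated upper bound for a micro-expression). All timestamps, labels and names below are invented for illustration; in practice the interaction log would come from your recording or analytics setup.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float      # seconds into the recording
    label: str

# Hypothetical data: an interaction log from the test session, and
# micro-expressions timestamped while scrubbing the video frame by frame.
interactions = [Event(12.0, "open calendar selector"),
                Event(47.3, "submit captcha"),
                Event(90.1, "open dropdown")]
expressions = [Event(12.2, "fear"), Event(47.6, "sadness")]

def link(expressions, interactions, window=0.5):
    """Pair each micro-expression with the closest preceding interaction
    within `window` seconds (micro-expressions last under half a second)."""
    pairs = []
    for ex in expressions:
        candidates = [ia for ia in interactions if 0 <= ex.t - ia.t <= window]
        if candidates:
            pairs.append((ex.label, max(candidates, key=lambda ia: ia.t).label))
    return pairs

print(link(expressions, interactions))
# [('fear', 'open calendar selector'), ('sadness', 'submit captcha')]
```

The half-second window is a heuristic taken from the duration of micro-expressions themselves; a reaction to a slow-loading page might need a wider window.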

On the left a screen capture with an interactive calendar, on the right the face of the user with an expression of fear.
On one frame, the user expresses fear when opening a calendar selector. For 0.5 second an intense fear was readable in the lowered eyebrows and the mouth. The main reaction to fear is for the user to get away.
On the left a screen capture with an error on a captcha, on the right the face of the user with an expression of sadness.
On another frame, a user expresses sadness after 3 failed attempts at a Captcha validation. Her eyebrows and lip corners lowered.
On the left a screen capture with a very long dropdown, on the right the face of the user with an expression of disgust.
Another micro-expression was spotted when a user opened a dropdown list that was far too long. We could see the wrinkled nose and the mouth of disgust. Reading disgust tells us that for this user the dropdown is functional but unaesthetic.
On the left a screen capture with errors, on the right the face of the user with an expression of anger.
Anger was spotted when multiple errors popped up on form validation. It seems the error messages were not explicit enough; the user felt blocked, triggering anger, readable in the eyebrow tension forming vertical marks and the closing mouth.
On the left a screen capture with a large popup, on the right the face of the user with an expression of surprise.
This one did not actually happen during the user test but illustrates how surprise can be triggered in the user’s emotional experience. Remember that surprise is rarely a good emotion for users; it is often mixed with fear.
On the left a screen capture with a completed task, on the right the face of the user with an expression of happiness.
The system we tested was not designed at all: it was made by developers only and was non-ergonomic and non-esthetic. That is why we did not spot happiness, but if we had, it would most likely have been at the end of the user experience.

Depending on the emotion expressed by the user, the danger for the system is not the same. An angry user could lose control and abandon their route. In extreme cases, they might even make an aggressive gesture towards their device. A user expressing disgust is very likely to want to get away from the system. Their disgust will most often be generated by unpleasant sensory stimuli (visual, auditory, haptic). If this problem appears, there are user testing methods geared towards analysing these sensory stimuli that can help correct it.

A surprised user might also want to get away from the system. Even if the surprise is followed by a positive feeling, the experience of the surprise itself may later generate the anticipation of fear, in which case the user would keep a negative memory of the experience. The expression of happiness is the only positive expression that can be analysed according to Paul Ekman’s work. In this case, the elements identified as the origin of this expression of happiness must be preserved.

How this analysis was received by stakeholders, business and sponsors

For the restitution, I presented on each slide a micro-expression, the associated FACS code, an explanation of the context and a UX recommendation.

A PowerPoint slide with a micro-expression, the associated FACS code, an explanation of the context and a UX recommendation.

The user test was carried out with 7 users, men and women, spread over 2 usage scenarios. Cross-referenced questions asked during the test allowed us to collect important information, to which were added around twenty frames each describing a micro- or macro-expression. The restitution was very well received by the business and the sponsors, who saw an innovative approach in this process. Unfortunately, the test was incomplete, due to technical issues that prevented some users from moving forward, or because some users who had to self-administer the test skipped questions. Even so, the study generated a total of 18 recommendations sorted by criticality and helped refine the personas thanks to the cross-referenced questions.

Do you have any question or feedback? Meet me in the comments or visit amat-design.com. Originally published at http://amat-design.com on June 9, 2021.



Analysing micro-expressions from a (failed) user testing session was originally published in UX Collective on Medium.
