The FACS as we know it today was first published in 1978 and substantially updated in 2002. Using FACS, researchers can determine the displayed emotion of a participant. This analysis of facial expressions is one of very few techniques available for assessing emotions in real time; facial electromyography (fEMG) is another option. Other measures, such as interviews and psychometric tests, must be completed after a stimulus has been presented. This delay adds another barrier to measuring how a participant truly feels in direct response to a stimulus. For a long time, researchers were limited to manually coding video recordings of participants according to the action units described by the FACS.
FACS is used across many different personal and professional settings. It is often used in scientific research, and also by animators and by computer scientists interested in facial recognition. FACS training may also foster greater awareness of, and sensitivity to, subtle facial behaviors; such skills are useful for psychotherapists, interviewers, and anyone working in communications. The system describes not only individual action units (AUs) but also how AUs appear in combination. The Paul Ekman Group offers the manual for sale.
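FACS itself does not assign emotions, but EMFACS-style schemes pair AU combinations with prototypical expressions. The sketch below illustrates the idea in Python; the AU sets shown are commonly cited examples (e.g., AU 6 + AU 12 for a felt smile), not an authoritative table from the FACS manual.

```python
# Illustrative mapping from action-unit (AU) combinations to prototypical
# emotions. The AU sets are commonly cited EMFACS-style examples, not an
# authoritative extract from the FACS manual.
PROTOTYPES = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",  # brow raisers + upper lid raiser + jaw drop
}

def label_expression(active_aus):
    """Return the first prototype whose AUs are all active, else 'unclassified'."""
    active = set(active_aus)
    for aus, emotion in PROTOTYPES.items():
        if aus <= active:
            return emotion
    return "unclassified"

print(label_expression([6, 12]))  # happiness
print(label_expression([4, 7]))   # unclassified
```

Note that real coding schemes also weigh AU intensity (scored A through E in FACS) and timing, which this minimal lookup ignores.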
Have you ever used a smiley face in a text message? When we look at a smiling face, we understand what is being communicated even in the absence of words. Interpreting facial expressions and other nonverbal cues through facial coding research is an increasingly popular marketing and research tool for gauging consumer sentiment.
FACS was developed by Paul Ekman and Wallace Friesen and published in 1978; Hager published a significant update to FACS in 2002. Using FACS, human coders can manually code nearly any anatomically possible facial expression, deconstructing it into the specific action units (AUs), and their temporal segments, that produced the expression. Due to subjectivity and time-consumption issues with manual coding, FACS has also been implemented as an automated computer system that detects faces in videos, extracts the geometrical features of the faces, and then produces temporal profiles of each facial movement.
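The last stage of the automated pipeline described above, turning per-frame AU measurements into temporal profiles, can be sketched as follows. This is a minimal illustration, assuming per-frame AU intensity scores are already available from some face detector; the function name, data layout, and threshold are assumptions, not a specific library's API.

```python
# Minimal sketch: derive a temporal profile (onset/offset spans) for one AU
# from per-frame intensity scores. All names and the 0.5 threshold are
# illustrative; real systems obtain the scores from trained face and
# landmark models.

def temporal_profile(frame_scores, au, threshold=0.5):
    """Collect (start, end) frame spans where the AU's intensity >= threshold."""
    spans, start = [], None
    for i, frame in enumerate(frame_scores):
        active = frame.get(au, 0.0) >= threshold
        if active and start is None:
            start = i                      # AU onset
        elif not active and start is not None:
            spans.append((start, i - 1))   # AU offset
            start = None
    if start is not None:                  # AU still active at the last frame
        spans.append((start, len(frame_scores) - 1))
    return spans

# Synthetic per-frame intensities for AU 12 (lip corner puller):
frames = [{12: 0.1}, {12: 0.7}, {12: 0.9}, {12: 0.2}, {12: 0.8}]
print(temporal_profile(frames, 12))  # [(1, 2), (4, 4)]
```

Segmenting each AU into onset/apex/offset spans like this is what lets automated systems report not just which movements occurred but when, which manual coders otherwise annotate frame by frame.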