Facial Action Coding System Research Papers
The facial action coding system (FACS) is an objective method for quantifying facial movement in terms of component actions. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include: analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to that of naive and expert human subjects. The best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of local filters, high spatial frequencies, and statistical independence in classifying facial actions.
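The Gabor-bank stage of this pipeline lends itself to a compact illustration. Below is a minimal sketch of a Gabor filter-bank feature extractor, with a nearest-neighbor classifier standing in for the paper's similarity-based matching; the filter parameters, pooling scheme, and classifier choice are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np
import cv2  # OpenCV supplies the Gabor kernels and 2-D filtering

def gabor_bank(ksize=31, n_thetas=8, lambdas=(8.0, 16.0), sigma=4.0):
    """Build a bank of Gabor kernels spanning orientations and wavelengths."""
    kernels = []
    for theta in np.linspace(0, np.pi, n_thetas, endpoint=False):
        for lambd in lambdas:
            kernels.append(cv2.getGaborKernel(
                (ksize, ksize), sigma, theta, lambd, 0.5, 0))
    return kernels

def gabor_features(gray_face, kernels):
    """Filter a grayscale face crop with each kernel and pool the magnitudes."""
    img = gray_face.astype(np.float32)
    return np.array([np.abs(cv2.filter2D(img, cv2.CV_32F, k)).mean()
                     for k in kernels])

# Usage sketch (X_train/y_train are labeled AU examples, assumed available):
# from sklearn.neighbors import KNeighborsClassifier
# clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)
# au_pred = clf.predict([gabor_features(face_crop, gabor_bank())])
```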
The facial action coding system (FACS) was used to examine recognition rates in 105 healthy young men and women who viewed 128 facial expressions of posed and evoked happy, sad, angry, and fearful emotions in color photographs balanced for gender and ethnicity of poser. Categorical analyses determined the specificity of individual action units for each emotion. Relationships between recognition rates for different emotions and action units were evaluated using a logistic regression model. Each emotion could be identified by a group of action units, characteristic of the emotion and distinct from those of other emotions. Characteristic happy expressions comprised raised inner eyebrows, a tightened lower eyelid, raised cheeks, a raised upper lip, and lip corners turned upward. Recognition of happy faces was associated with cheek raise, lid tightening, and outer brow raise. Characteristic sad expressions comprised furrowed eyebrows, an opened mouth with the upper lip raised, lip corners stretched and turned down, and the chin pulled up. Only brow lower and chin raise were associated with sad recognition. Characteristic anger expressions comprised lowered eyebrows, eyes wide open with a tightened lower lid, lips exposing teeth, and stretched lip corners. Recognition of angry faces was associated with lowered eyebrows, upper lid raise, and lower lip depression. Characteristic fear expressions comprised eyes wide open, furrowed and raised eyebrows, and a stretched mouth. Recognition of fearful faces was most highly associated with upper lip raise and nostril dilation, although both occurred infrequently, and with inner brow raise and widened eyes. Comparisons are made with previous studies that used different facial stimuli.
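A toy version of the logistic-regression analysis described above might look like the following; the AU codes and recognition outcomes here are invented for illustration only and do not reproduce the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative data: columns are binary FACS codes for AU6 (cheek raiser),
# AU7 (lid tightener), and AU2 (outer brow raiser) on each happy stimulus;
# y records whether viewers correctly recognized the face as happy.
X = np.array([[1, 1, 0],
              [1, 0, 1],
              [1, 1, 1],
              [0, 0, 0],
              [0, 1, 0],
              [1, 0, 0]])
y = np.array([1, 1, 1, 0, 0, 1])

model = LogisticRegression().fit(X, y)
# Positive coefficients indicate AUs whose presence raises the odds of
# correct recognition.
print(dict(zip(["AU6", "AU7", "AU2"], model.coef_[0].round(2))))
```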
In the present research, we test the assumption that emotional mimicry and contagion are moderated by group membership. We report two studies using facial electromyography (EMG; Study 1), the Facial Action Coding System (FACS; Study 2), and self-reported emotions (Study 2) as dependent measures. As predicted, both studies show that ingroup anger and fear displays were mimicked to a greater extent than outgroup displays of these emotions. The self-report data in Study 2 further showed specific divergent reactions ...
- by B. Doosje
- Psychology, Cognitive Science, Emotion, Anger
We present ongoing work on a project for automatic recognition of spontaneous facial actions. Spontaneous facial expressions differ substantially from posed expressions, similar to how continuous, spontaneous speech differs from isolated words produced on command. Previous methods for automatic facial expression recognition assumed images were collected in controlled environments in which the subjects deliberately faced the camera.
- by Kenneth Prkachin
- Emotion, Pain, Human Development, Face
- by Etienne Roesch and +2
- Psychology, Cognitive Science, Nonverbal Behavior, Facial expression
- by Kenneth Prkachin and +1
- Pain, Facial expression, Physical Therapy, Informal Communication
We develop an automatic system to analyze subtle changes in upper face expressions based on both permanent facial features (brows, eyes, mouth) and transient facial features (deepening of facial furrows) in a nearly frontal image sequence. Our system recognizes fine-grained changes in facial expression based on Facial Action Coding System (FACS) action units (AUs). Multi-state facial component models are proposed for tracking and modeling different facial features, including eyes, brows, cheeks, and furrows. Then we ...
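As a rough sketch of what a multi-state component model can look like in code, the snippet below represents the mouth as a small set of discrete states plus tracked geometric parameters; the state names, parameters, and threshold are illustrative assumptions rather than the system's actual definitions.

```python
from dataclasses import dataclass
from enum import Enum

class LipState(Enum):
    OPEN = "open"
    CLOSED = "closed"
    TIGHTLY_CLOSED = "tightly_closed"

@dataclass
class LipComponent:
    """Geometric parameters tracked frame to frame for the mouth region."""
    state: LipState
    left_corner: tuple    # (x, y) image coordinates
    right_corner: tuple
    opening_height: float  # inner-lip distance in pixels

def update_lip_state(lip: LipComponent, closed_thresh: float = 2.0) -> LipState:
    """Switch the active parametric model when the opening crosses a threshold."""
    return LipState.CLOSED if lip.opening_height < closed_thresh else LipState.OPEN
```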
A framework for generating facial expressions from emotional states in daily conversation is described. It provides a mapping between emotional states and facial expressions, where the former is represented by vectors with psychologically defined abstract dimensions, and the latter is coded by the Facial Action Coding System. In order to obtain the mapping, parallel data with rated emotional states and facial ...
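One simple way to realize such a mapping, assuming a linear relationship between the emotion-space vectors and AU intensities (an assumption of this sketch, not necessarily the framework's actual method), is an ordinary least-squares fit over the parallel data:

```python
import numpy as np

# Hypothetical parallel data: each row pairs an emotional-state vector
# (e.g. 2-D valence/arousal ratings) with the AU intensities observed for it.
E = np.array([[ 0.8,  0.3],    # happy-ish state
              [-0.7,  0.2],    # sad-ish state
              [-0.5,  0.9]])   # fearful-ish state
A = np.array([[0.9, 0.1, 0.0],  # illustrative AU12, AU15, AU1 intensities
              [0.0, 0.8, 0.3],
              [0.1, 0.2, 0.9]])

# Least-squares mapping M such that E @ M approximates A; a new emotional
# state can then be mapped to target AU activations.
M, *_ = np.linalg.lstsq(E, A, rcond=None)
new_state = np.array([[0.6, 0.4]])
print(new_state @ M)  # predicted AU activation vector
```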
Drowsy driver detection is one of the potential applications of intelligent vehicle systems. Previous approaches to drowsiness detection primarily make pre-assumptions about the relevant behavior, focusing on blink rate, eye closure, and yawning. Here we employ machine learning to mine actual human behavior during drowsiness episodes. Automatic classifiers for 30 facial actions from the facial action coding system were developed using machine learning on a separate database of spontaneous expressions. These facial actions include blinking and yawn motions, as well as a number of other facial movements. These measures were passed to learning-based classifiers such as AdaBoost and multinomial ridge regression. Head motion information was collected through automatic eye tracking and an accelerometer. The system was able to predict sleep and crash episodes on a simulator with 98% accuracy across subjects, the highest prediction rate reported to date for detecting drowsiness. Moreover, the analysis revealed new information about human facial behavior in drowsy drivers.
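To make the classification stage concrete, here is a sketch of feeding per-segment AU detector outputs to an AdaBoost classifier; the feature layout, labels, and data below are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

# Placeholder features: per-segment statistics of 30 AU detector outputs
# (e.g. mean intensity of AU45 blink, AU26 jaw drop) for each driving segment.
X = rng.random((200, 30))
# Stand-in drowsiness label derived from two features, for demonstration only.
y = (X[:, 0] + X[:, 1] > 1.2).astype(int)

# Train on the first 150 segments, evaluate on the remaining 50.
clf = AdaBoostClassifier(n_estimators=50).fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```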