Facial Reactions to Emotional Facial Expressions: Affect or Cognition?
Related papers
Mimicry and the judgment of emotional facial expressions
Journal of Nonverbal Behavior, 1999
Lipps (1907) presented a model of empathy which had an important influence on later formulations. According to Lipps, individuals tend to mimic an interaction partner's behavior, and this nonverbal mimicry induces—via a feedback process—the corresponding affective state in the observer. The resulting shared affect is believed to foster the understanding of the observed person's self. The present study tested this model in the context of judgments of emotional facial expressions. The results confirm that individuals mimic emotional facial ...
Emotional empathy as related to mimicry reactions at different levels of information processing
Journal of Nonverbal …, 2003
The hypotheses of this investigation were based on conceiving of facial mimicry reactions in face-to-face interactions as an early automatic component in the process of emotional empathy. Differences between individuals high and low in emotional empathy were investigated. The parameters compared were facial mimicry reactions, as represented by electromyographic (EMG) activity, when individuals were exposed to pictures of angry or happy faces. The present study distinguished between spontaneous facial reactions and facial expressions associated with more controlled or modulated emotions at different information-processing levels, first at a preattentive level and then consecutively at more consciously controlled levels: 61 participants were exposed to pictures at three different exposure times (17, 56, and 2350 ms). A significant difference in facial mimicry reactions between high- and low-empathy participants emerged at short exposure times (56 ms), representing automatic, spontaneous reactions, with high-empathy participants showing a significant mimicking reaction. The low-empathy participants did not display mimicking at any exposure time. In contrast, the low-empathy participants showed, in response to angry faces, a tendency toward elevated activation in the cheek region, which is often associated with smiling.
Automatic mimicry reactions as related to differences in emotional empathy
Scandinavian Journal of Psychology, 2002
The hypotheses of this investigation were derived by conceiving of automatic mimicking as a component of emotional empathy. Differences between subjects high and low in emotional empathy were investigated. The parameters compared were facial mimicry reactions, as represented by electromyographic (EMG) activity when subjects were exposed to pictures of angry or happy faces, and the degree of correspondence between subjects' facial EMG reactions and their self-reported feelings. The comparisons were made at different stimulus exposure times in order to elicit reactions at different levels of information processing. The high-empathy subjects were found to have a higher degree of mimicking behavior than the low-empathy subjects, a difference that emerged at short exposure times (17-40 ms) that represented automatic reactions. Already at these short exposure times (17-40 ms), the low-empathy subjects tended to show inverse zygomaticus muscle reactions, namely "smiling" when exposed to an angry face. The high-empathy group was characterized by a significantly higher correspondence between facial expressions and self-reported feelings. No differences were found between the high- and low-empathy subjects in their verbally reported feelings when presented with a happy or an angry face. Thus, the differences between the groups in emotional empathy appeared to be related to differences in automatic somatic reactions to facial stimuli rather than to differences in their conscious interpretation of the emotional situation.
The current study addressed the hypothesis that empathy and the restriction of observers' facial muscles can influence recognition of emotional facial expressions. A sample of 74 participants recognized the subjective onset of emotional facial expressions (anger, disgust, fear, happiness, sadness, surprise, and neutral) in a series of morphed face photographs showing a gradual change (frame by frame) from one expression to another. The high-empathy participants (as measured by the Empathy Quotient) recognized emotional facial expressions at earlier photographs in the series than did low-empathy ones, but there was no difference in exploration time. Restriction of observers' facial muscles (with plasters and a stick in the mouth) did not influence the responses. We discuss these findings in the context of embodied simulation theory and previous data on empathy. We would like to thank P. Yu. Medvedev (Dzerzhinsk, Nizhny Novgorod Oblast, Russia) for building the software for facial expression recognition.
Role of facial expressions in social interactions
Philosophical Transactions of the Royal Society B: Biological Sciences, 2009
The expressions we see in the faces of others engage a number of different cognitive processes. Emotional expressions elicit rapid responses, which often imitate the emotion in the observed face. These effects can even occur for faces presented in such a way that the observer is not aware of them. We are also very good at explicitly recognizing and describing the emotion being expressed. A recent study, contrasting human and humanoid robot facial expressions, suggests that people can recognize the expressions made by the robot explicitly, but may not show the automatic, implicit response. The emotional expressions presented by faces are not simply reflexive, but also have a communicative component. For example, empathic expressions of pain are not simply a reflexive response to the sight of pain in another, since they are exaggerated when the empathizer knows he or she is being observed. It seems that we want people to know that we are empathic. Of special importance among facial expressions are ostensive gestures such as the eyebrow flash, which indicate the intention to communicate. These gestures indicate, first, that the sender is to be trusted and, second, that any following signals are of importance to the receiver.
Conceptual and methodological issues in the judgment of facial expressions of emotion
Motivation and Emotion, 1995
In two studies, subjects judged a set of facial expressions of emotion by either providing labels of their own choice to describe the stimuli (free-choice condition), choosing a label from a list of emotion words, or choosing a story from a list of emotion stories (fixed-choice conditions). In the free-choice condition, levels of agreement between subjects on the predicted emotion categories for six basic emotions were significantly greater than chance levels, and comparable to those shown in fixed-choice studies. As predicted, there was little to no agreement on a verbal label for contempt. Agreement on contempt was greatly improved when subjects were allowed to identify the expression in terms of an antecedent event for that emotion rather than in terms of a single verbal label, a finding that could not be attributed to the methodological artifact of exclusion in a fixed-choice paradigm. These findings support two conclusions: (1) that the labels used in fixed-choice paradigms accurately reflect the verbal categories people use when free labeling facial expressions of emotion, and (2) that lexically ambiguous emotions, such as contempt, are understood in terms of their situational meanings. Over the past 25 years numerous studies of literate and preliterate cultures have shown that anger, disgust, fear, happiness, sadness, and surprise are universally recognized (for reviews see …)
Journal of Personality and …, 1990
This study used a technique for assessing the relative impact of facial-gestural expressions, as opposed to contextual information regarding the elicitor and situation, on the judgment of emotion. In Study 1, 28 undergraduates rated videotapes of spontaneous facial-gestural expressions and separately rated the emotionally loaded color slides that elicited those expressions. The source clarities of the expressions and slides were matched using correlation and distance measures, and 18 expressions and 9 slides were selected. In Study 2, 72 undergraduate receivers were shown systematic pairings of these expressions and slides and rated the emotional state of the expresser, who was supposedly watching that slide under public or private situational conditions. Expressions were found to be more important sources for all emotion judgments. For female receivers slides were relatively more important in the public than in the private situation.
From face to face: the contribution of facial mimicry to cognitive and emotional empathy
Cognition and Emotion, 2019
Despite advances in the conceptualisation of facial mimicry, its role in the processing of social information is a matter of debate. In the present study, we investigated the relationship between mimicry and cognitive and emotional empathy. To assess mimicry, facial electromyography was recorded for 70 participants while they completed the Multifaceted Empathy Test, which presents complex context-embedded emotional expressions. As predicted, inter-individual differences in emotional and cognitive empathy were associated with the level of facial mimicry. For positive emotions, the intensity of the mimicry response scaled with the level of state emotional empathy. Mimicry was stronger for the emotional empathy task compared to the cognitive empathy task. The specific empathy condition could be successfully detected from facial muscle activity at the level of single individuals using machine learning techniques. These results support the view that mimicry occurs depending on the social context, as a tool for affiliation, and that it is involved in cognitive as well as emotional empathy.
PLoS ONE, 2016
A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like "automatic" response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target's facial expressions depends on whether participants are motivated to infer the target's emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target's emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Actio...