Recognizing Emotion From Facial Expressions: Psychological and Neurological Mechanisms
Behavioral and Cognitive Neuroscience Reviews (Adolphs)
Related papers
Processing Faces and Facial Expressions
This paper reviews the processing of facial identity and expressions. The question of whether these two systems are independent has been addressed with different approaches over the past 25 years. More recently, neuroimaging techniques have provided researchers with new tools to investigate how facial information is processed in the brain. First, findings from "traditional" approaches to identity and expression processing are summarized. The review then covers findings from neuroimaging studies on face perception, recognition, and encoding. Processing of the basic facial expressions is detailed in light of behavioral and neuroimaging data. Whereas data from experimental and neuropsychological studies support the existence of two systems, the neuroimaging literature yields a less clear picture because it shows considerable overlap in activation patterns in response to the different face-processing tasks. Further, activation patterns in response to facial expressions support the notion of distinct neural substrates for processing different facial expressions.
Neuropsychologia, 2007
Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly by fearful expressions, but also by other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on fusiform activity; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces.
Other electrical components arise at later latencies and involve more sustained activity, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with distributed activity in time and space. Moreover, although traditional models in cognitive neuropsychology have often held that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but might play an important role in the normal development of face-processing skills and in some neuropsychiatric disorders.
Mediterranean Journal of Clinical Psychology, 2019
This paper investigates a distinction, supported by experimental data, between two degrees within so-called face perception: 1) the automatic perception/detection of faces; 2) the recognition of a specific face, which involves associating personal meanings (a story, we could say) with that first automatic perceptual configuration. In general, the first degree is a basic perceptual process: a universal, innate, and early capacity belonging to all human beings. It involves three face-selective regions in the brain, with the OFA and the STS processing the partial data of faces and the FFA "producing" the overall basic form. The second degree consists of a complex recognition process, which implies the activation of many cerebral areas with different functions such as, for example, the subcortical regions responsible for emotion modulation (amygdala, insula), the intraparietal sulcus, and the auditory cortex. It associates a given perceptual pattern w...
Facial expression and selective attention
Current Opinion in Psychiatry, 2002
Recent findings demonstrate that faces with an emotional expression tend to attract attention more than neutral faces, especially when having some threat-related value (anger or fear). These findings suggest that discrimination of emotional cues in faces can at least partly be extracted at preattentive or unconscious stages of processing, and then serve to enhance awareness and behavioural responses toward emotionally relevant stimuli. Functional neuroimaging results have begun to delineate brain regions whose response to threat-related expressions is independent of voluntary attention (e.g. amygdala and orbitofrontal cortex), and other regions whose response occurs only with attention (e.g. superior temporal and anterior cingulate cortex). Moreover, visual responses in the fusiform cortex are enhanced for emotional faces, consistent with their greater perceptual saliency. Recent data from event-related evoked potentials and neurophysiology also suggest that rapid processing of emotional information may not only occur in parallel to, but promote a more detailed perceptual analysis of, sensory inputs and thus bias competition for attention toward the representation of emotionally salient stimuli. Curr Opin Psychiatry
Human brain potentials related to the emotional expression, repetition, and gender of faces
Psychobiology, 1998
Event-related potentials were recorded from 20 healthy male subjects in response to a large number of color slides of unfamiliar faces with happy, sad, or no emotional expression. In an initial task, the subjects rated the emotional valence of the faces with a joystick. In comparison with neutral faces, both happy and sad faces evoked a larger lateral occipito-temporal negativity from 200 to 400 msec poststimulus onset. Modulation of the late positive complex (LPC: 450-600 msec) by emotional expressions was observed at the frontal sites only in this task, when attention to the emotional valence was required. In a second task, the subjects detected repeating faces among nonrepeating, novel faces. Emotionally expressive faces evoked a more negative potential than neutral faces occipito-temporally between 270 and 540 msec latency. Although repetition had a large effect in decreasing the N4 and increasing the LPC, it did not interact with emotional expression, supporting the previously proposed independence between processing of face identity and emotional expression. These findings imply that emotional expression affects early perceptual stages as well as later cognitive stages of face processing. Nonrepeated male faces in both tasks evoked a larger late negativity than female faces. A large body of literature concerning perceptive, expressive, physiological, emotional, social, and other aspects of facial communication has emphasized its exceptional importance in social interactions. Moreover, face-selective brain circuitry has apparently evolved as a natural adaptation to accommodate the need to interact quickly and reliably, predating the very recent development of human-specific verbal communication. Convergent evidence points at the basal temporo-occipital cortex (fusiform gyrus), which, as a part of the ventral processing stream, is important in face processing.
Patients with lesions in this area are deficient in face recognition and are diagnosed with prosopagnosia (Damasio, Damasio, & Tranel, 1990; Meadows, 1974). Brain imaging techniques such as positron emission tomography and functional magnetic resonance imaging have verified the face-selective activation of this area.
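The ERP logic underlying the study above is that averaging many time-locked single-trial epochs cancels trial-to-trial noise while preserving the evoked waveform. A minimal sketch with synthetic data (all sizes, sampling choices, and amplitudes here are hypothetical illustrations, not the study's parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 40, 600   # hypothetical: 40 trials, 600 samples per epoch
# Simulated evoked signal common to all trials, plus independent noise
signal = 2.0 * np.sin(np.linspace(0, 3 * np.pi, n_samples))
epochs = signal + rng.normal(0.0, 5.0, size=(n_trials, n_samples))

# The ERP is the time-locked average across trials; noise shrinks
# roughly as 1/sqrt(n_trials) while the evoked component is preserved.
erp = epochs.mean(axis=0)

err_single = np.abs(epochs[0] - signal).mean()  # error of one noisy trial
err_avg = np.abs(erp - signal).mean()           # error of the averaged waveform
```

With 40 trials, the averaged waveform tracks the underlying evoked signal far more closely than any single trial, which is why component amplitudes (N170, N4, LPC) are measured on averages rather than raw epochs.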
Proceedings of the National Academy of Sciences, 2008
The ability to perceive and differentiate facial expressions is vital for social communication. Numerous functional MRI (fMRI) studies in humans have shown enhanced responses to faces with different emotional valence, in both the amygdala and the visual cortex. However, relatively few studies have examined how valence influences neural responses in monkeys, thereby limiting the ability to draw comparisons across species and thus understand the underlying neural mechanisms. Here we tested the effects of macaque facial expressions on neural activation within these two regions using fMRI in three awake, behaving monkeys. Monkeys maintained central fixation while blocks of different monkey facial expressions were presented. Four different facial expressions were tested: (i) neutral, (ii) aggressive (open-mouthed threat), (iii) fearful (fear grin), and (iv) submissive (lip smack). Our results confirmed that both the amygdala and the inferior temporal cortex in monkeys are modulated by facial expressions. As in human fMRI, fearful expressions evoked the greatest response in monkeys, even though fearful expressions are physically dissimilar in humans and macaques. Furthermore, we found that valence effects were not uniformly distributed over the inferior temporal cortex. Surprisingly, these valence maps were independent of two related functional maps: (i) the map of "face-selective" regions (faces versus non-face objects) and (ii) the map of "face-responsive" regions (faces versus scrambled images). Thus, the neural mechanisms underlying face perception and valence perception appear to be distinct.
Modulation of Early Perceptual Processing by Emotional Expression and Acquired Valence of Faces
Journal of Psychophysiology, 2012
Modulation of early perceptual processing by emotional expression and the affective valence of faces was explored in an event-related potential (ERP) study. An associative procedure was used in which different neutral faces changed to happy, changed to angry, or, in a control condition, stayed the same. Based on these changes in expression, participants then had to identify each neutral face as belonging to a friendly, hostile, or neutral individual. ERP measures revealed modulations at occipital-temporal sites of the P100 and N170 components by both the emotional expression and the valence of the associated neutral faces. The early posterior negativity (EPN) component, however, was only sensitive to emotional expression. These results are consistent with previous findings showing that emotional expression influences face perception from the early stages of visual processing, and provide new evidence that this influence can also be transferred to neutral faces through associative learning.
Brain Research, 2011
Recognition and processing of emotional facial expression are crucial for social behavior and employ higher-order cognitive and visual working processes. In neuropsychiatric disorders, impaired emotion recognition most frequently concerns three specific emotions, i.e., anger, fear, and disgust. As incorrect processing of (neutral) facial stimuli per se might also underlie ...
Emotional expressions evoke a differential response in the fusiform face area
Frontiers in Human Neuroscience, 2013
It is widely assumed that the fusiform face area (FFA), a brain region specialized for face perception, is not involved in processing emotional expressions. This assumption is based on the proposition that the FFA is involved in face identification and only processes features that are invariant across changes due to head movements, speaking, and expressing emotions. The present study tested this proposition by examining whether the response in the human FFA varies across emotional expressions, using functional magnetic resonance imaging and brain-decoding analysis techniques (n = 11). A one vs. all classification analysis showed that most emotional expressions that participants perceived could be reliably predicted from the neural pattern of activity in the left and right FFA, suggesting that the perception of different emotional expressions recruits partially non-overlapping neural mechanisms. In addition, emotional expressions could also be decoded from the pattern of activity in the early visual cortex (EVC), indicating that retinotopic cortex also shows a differential response to emotional expressions. These results cast doubt on the idea that the FFA is involved in expression-invariant face processing, and instead indicate that emotional expressions evoke partially de-correlated signals throughout occipital and posterior temporal cortex.
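The decoding logic in the last abstract is that if a classifier trained on multi-voxel activity patterns predicts the expression of held-out trials above chance, the region carries expression information. The sketch below uses purely synthetic "voxel patterns" and a simple nearest-centroid rule as a stand-in for the paper's one vs. all classifier; all sizes (4 classes, 30 trials, 50 voxels) are hypothetical, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_classes, n_trials, n_voxels = 4, 30, 50  # hypothetical ROI dimensions
# Each expression gets a distinct mean pattern; trials add Gaussian noise
means = rng.normal(0.0, 1.0, size=(n_classes, n_voxels))
X = np.stack([means[k] + rng.normal(0.0, 1.5, size=(n_trials, n_voxels))
              for k in range(n_classes)])   # shape: (class, trial, voxel)

# Split-half decoding: estimate class centroids from even trials,
# then classify the held-out odd trials.
centroids = X[:, ::2].mean(axis=1)           # (class, voxel)
test = X[:, 1::2].reshape(-1, n_voxels)      # held-out patterns
labels = np.repeat(np.arange(n_classes), n_trials // 2)

# Assign each held-out pattern to its nearest training centroid
dists = np.linalg.norm(test[:, None, :] - centroids[None], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == labels).mean()           # chance level here is 0.25
```

If the class means were identical (no expression information in the region), accuracy would hover around the 1/4 chance level; reliably above-chance accuracy is the signature the study uses to argue that FFA and EVC patterns differentiate expressions.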