Laterality and (in)visibility in emotional face perception: Manipulations in spatial frequency content

Ratings of emotion in faces are influenced by the visual field to which stimuli are presented

Brain and Cognition, 1987

This experiment was designed to assess the differential impact of initially presenting affective information to the left versus right hemisphere on both the perception of and response to the input. Nineteen right-handed subjects were presented with faces expressing happiness and sadness. Each face was presented twice to each visual field for an 8-sec duration. The electro-oculogram (EOG) was monitored and fed back to subjects to train them to keep their eyes focused on the central fixation point, as well as to eliminate trials confounded by eye movements.

Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli

Laterality: Asymmetries of Body, Brain and Cognition

This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralisation for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.
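The chimeric face test referenced above joins opposite hemifaces into a single stimulus. As a minimal illustration of that construction (a sketch only; the study's actual stimuli presumably involve careful alignment and blending at the midline):

```python
import numpy as np

def make_chimeric(left_face, right_face):
    """Join the left half of one face image with the right half of another.

    left_face, right_face: 2-D grayscale arrays of identical shape.
    """
    if left_face.shape != right_face.shape:
        raise ValueError("face images must have the same shape")
    mid = left_face.shape[1] // 2
    chimera = np.empty_like(left_face)
    chimera[:, :mid] = left_face[:, :mid]   # viewer's left half
    chimera[:, mid:] = right_face[:, mid:]  # viewer's right half
    return chimera

# Toy example: 4x4 "faces" of constant intensity standing in for
# a happy and a neutral photograph of the same person.
happy = np.full((4, 4), 1.0)
neutral = np.full((4, 4), 0.0)
chimera = make_chimeric(happy, neutral)
```

Presenting such half-emotive composites lets an experiment ask which hemiface dominates the emotion judgement, which is how the chimeric face test probes right-hemisphere involvement.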

Brain lateralization of holistic versus analytic processing of emotional facial expressions

NeuroImage, 2014

This study investigated the neurocognitive mechanisms underlying the role of the eye and the mouth regions in the recognition of facial happiness, anger, and surprise. To this end, face stimuli were shown in three formats (whole face, upper half visible, and lower half visible) and behavioral categorization, computational modeling, and ERP (event-related potentials) measures were combined. N170 (150-180 ms post-stimulus; right hemisphere) and EPN (early posterior negativity; 200-300 ms; mainly, right hemisphere) were modulated by expression of whole faces, but not by separate halves. This suggests that expression encoding (N170) and emotional assessment (EPN) require holistic processing, mainly in the right hemisphere. In contrast, the mouth region of happy faces enhanced left temporo-occipital activity (150-180 ms), and also the LPC (late positive complex; centro-parietal) activity (350-450 ms) earlier than the angry eyes (450-600 ms) or other face regions. Relatedly, computational modeling revealed that the mouth region of happy faces was also visually salient by 150 ms following stimulus onset. This suggests that analytical or part-based processing of the salient smile occurs early (150-180 ms) and lateralized (left), and is subsequently used as a shortcut to identify the expression of happiness (350-450 ms). This would account for the happy face advantage in behavioral recognition tasks when the smile is visible.

Hemispheric asymmetries in processing emotional expressions

Neuropsychologia, 1983

Three experiments are reported on visual field asymmetries in the perception of emotional expressions on the face. In Experiment I, full faces expressing six different emotions were presented unilaterally for exposure durations allowing the subject to judge whether the facial expression was positive or negative. Right-handed subjects judged all expressions except happiness as more negative when presented in the left visual field (LVF). This effect was smaller for left-handers and was absent in left-handers who use the non-inverted writing posture. In Experiment II, subjects were presented with happy, sad, and "mixed" chimeric faces, projected to each visual field, for durations allowing only the detection of the existence of a face. LVF presentations produced greater differential rating of emotional valence for the three types of stimuli. In Experiment III, chimeric faces containing happy and sad expressions were presented unilaterally for durations allowing the subject to perceive the existence of two expressions on the face. The subjects were required to decide whether the mood expressed in the face was predominantly negative or positive. RVF presentations resulted in a bias toward positive judgments. These results indicate right hemispheric superiority for the perception and processing of emotional valence and a left hemispheric perceptual bias toward positive aspects of emotional stimuli.

Lateralized hybrid faces: Evidence of a valence-specific bias in the processing of implicit emotions

Laterality: Asymmetries of Body, Brain and Cognition, 2014

It is well known that hemispheric asymmetries exist for the analysis of both low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of these factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, in which the emotion is carried by the low-frequency information while an image of the same face with a neutral expression is superimposed on it. Although hybrid faces are perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess the friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions with lateral presentations as well. Happy faces were rated as more friendly, and angry faces as less friendly, than neutral faces. In general, hybrid faces were evaluated as less friendly when presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. The results extend the validity of the valence hypothesis to the specific domain of unaware (subcortical) emotion processing.
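A hybrid face combines the low spatial-frequency content of an emotional face with the remaining high-frequency content of a neutral image of the same face. One way to build such a stimulus is with a Gaussian filter applied in the frequency domain; the sketch below is illustrative only (the `sigma` value and filter shape are assumptions, not the authors' exact procedure):

```python
import numpy as np

def gaussian_lowpass(img, sigma):
    """Low-pass filter a 2-D image by multiplying its spectrum
    with a Gaussian (sigma in cycles/pixel)."""
    rows, cols = img.shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    gauss = np.exp(-(fx**2 + fy**2) / (2 * sigma**2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * gauss))

def hybrid_face(emotional, neutral, sigma=0.05):
    """Low SF content from the emotional face plus the residual
    high SF content of the neutral face."""
    low = gaussian_lowpass(emotional, sigma)
    high = neutral - gaussian_lowpass(neutral, sigma)
    return low + high
```

At normal viewing distances the high-frequency neutral features dominate conscious perception, while the low-frequency emotional content remains available to coarse (possibly subcortical) processing.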

Emotional expressions evoke a differential response in the fusiform face area

Frontiers in Human Neuroscience, 2013

It is widely assumed that the fusiform face area (FFA), a brain region specialized for face perception, is not involved in processing emotional expressions. This assumption is based on the proposition that the FFA is involved in face identification and only processes features that are invariant across changes due to head movements, speaking and expressing emotions. The present study tested this proposition by examining whether the response in the human FFA varies across emotional expressions with functional magnetic resonance imaging and brain decoding analysis techniques (n = 11). A one vs. all classification analysis showed that most emotional expressions that participants perceived could be reliably predicted from the neural pattern of activity in the left and right FFA, suggesting that the perception of different emotional expressions recruits partially non-overlapping neural mechanisms. In addition, emotional expressions could also be decoded from the pattern of activity in the early visual cortex (EVC), indicating that retinotopic cortex also shows a differential response to emotional expressions. These results cast doubt on the idea that the FFA is involved in expression-invariant face processing, and instead indicate that emotional expressions evoke partially de-correlated signals throughout occipital and posterior temporal cortex.
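One-vs-all decoding trains a separate binary discriminant for each expression and assigns each activity pattern to the highest-scoring class. The study presumably used proper cross-validated classifiers; the toy sketch below only illustrates the one-vs-all idea, using a simple mean-difference discriminant on synthetic "voxel" patterns:

```python
import numpy as np

def one_vs_all_fit(X, y):
    """For each class, a simple linear one-vs-rest discriminant:
    the class-mean pattern minus the mean of all other patterns,
    used as a projection weight vector."""
    weights = {}
    for c in np.unique(y):
        in_class = X[y == c].mean(axis=0)
        rest = X[y != c].mean(axis=0)
        weights[c] = in_class - rest
    return weights

def one_vs_all_predict(X, weights):
    """Assign each pattern to the class whose discriminant scores highest."""
    classes = list(weights)
    scores = np.stack([X @ weights[c] for c in classes], axis=1)
    return np.array([classes[i] for i in scores.argmax(axis=1)])

# Synthetic patterns: two "expressions" with distinct mean activity.
X = np.array([[1.0, 0.0], [1.1, 0.0], [0.0, 1.0], [0.0, 1.1]])
y = np.array([0, 0, 1, 1])
```

Above-chance prediction accuracy on held-out patterns is what licenses the inference that the region carries expression information.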

Decoding facial blends of emotion: Visual field, attentional and hemispheric biases

Brain and Cognition, 2013

Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced.

Functional asymmetry and interhemispheric cooperation in the perception of emotions from facial expressions

2006

The present study used the redundant target paradigm on healthy subjects to investigate functional hemispheric asymmetries and interhemispheric cooperation in the perception of emotions from faces. In Experiment 1 participants responded to checkerboards presented either unilaterally to the left (LVF) or right visual half field (RVF), or simultaneously to both hemifields (BVF), while performing a pointing task for the control of eye movements. As previously reported (Miniussi et al. in J Cogn Neurosci 10:216-230, 1998), redundant stimulation led to shorter latencies for stimulus detection (bilateral gain or redundant target effect, RTE) that exceeded the limit for a probabilistic interpretation, thereby validating the pointing procedure and supporting interhemispheric cooperation. In Experiment 2 the same pointing procedure was used in a go/no-go task requiring subjects to respond when seeing a target emotional expression (happy or fearful, counterbalanced between blocks). Faster reaction times to unilateral LVF than RVF emotions, regardless of valence, indicate that the perception of positive and negative emotional faces is lateralized toward the right hemisphere. Simultaneous presentation of two congruent emotional faces, either happy or fearful, produced an RTE that cannot be explained by probability summation, suggesting interhemispheric cooperation and neural summation. No such effect was present with BVF incongruent facial expressions. In Experiment 3 we studied whether the RTE for emotional faces depends on physical identity between BVF stimuli, adding a second BVF congruent condition in which there was only emotional, but not physical or gender, identity between stimuli (i.e. two different faces expressing the same emotion). The RTE and interhemispheric cooperation were present in this second BVF congruent condition as well. This shows that emotional congruency is a sufficient condition for the RTE to take place in the intact brain and that the cerebral hemispheres can interact in spite of physical differences between stimuli.
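The "limit for a probabilistic interpretation" mentioned above is usually Miller's race-model inequality: under pure probability summation, the cumulative RT distribution for bilateral targets can never exceed the sum of the two unilateral distributions at any time point. A minimal sketch of testing that bound on empirical RTs (illustrative, not the authors' exact analysis):

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times at times t."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t, side="right") / len(rts)

def race_model_violation(rt_lvf, rt_rvf, rt_bvf, t):
    """Miller's race-model inequality: under probability summation,
    F_bvf(t) <= F_lvf(t) + F_rvf(t) at every t. Positive values in
    the returned array mark time points where the bound is violated,
    implying neural coactivation rather than a simple race."""
    bound = np.minimum(ecdf(rt_lvf, t) + ecdf(rt_rvf, t), 1.0)
    return ecdf(rt_bvf, t) - bound
```

A violation at the fast end of the RT distribution is the standard evidence that bilateral redundancy gains reflect neural summation rather than statistical facilitation.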

Emotion perception is mediated by spatial frequency content.

Emotion, 2011

Spatial frequencies have been shown to play an important role in face identification, but very few studies have investigated the role of spatial frequency content in identifying different emotions. In the present study we investigated the role of spatial frequency in identifying happy and sad facial expressions. Two experiments were conducted to investigate (a) the role of specific spatial frequency content in emotion identification and (b) hemispheric asymmetry in emotion identification. Given links between global processing, happy emotions, and low frequencies, we hypothesized that low spatial frequencies would be important for identifying the happy expression. Correspondingly, we hypothesized that high spatial frequencies would be important in identifying the sad expression, given the links between local processing, sad emotions, and high spatial frequencies. As expected, we found that identification of the happy expression depended on low spatial frequencies and identification of the sad expression depended on high spatial frequencies. There was also a hemispheric asymmetry in the identification of the sad expression, with a right-hemisphere advantage possibly mediated by high spatial frequency content. The results indicate the importance of spatial frequency content in the identification of happy and sad emotional expressions and point to the mechanisms involved in emotion identification.
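Low- versus high-pass manipulations of this kind can be sketched as a hard split of the image spectrum at a cutoff expressed in cycles per image. An ideal (hard) filter is used here for simplicity; published studies typically use smoother filters and report specific cutoffs, so treat the details as assumptions:

```python
import numpy as np

def split_spatial_frequencies(img, cutoff):
    """Split a 2-D image into low and high spatial-frequency
    components with an ideal cutoff, in cycles per image."""
    rows, cols = img.shape
    fy = np.fft.fftfreq(rows) * rows  # cycles per image, vertical
    fx = np.fft.fftfreq(cols) * cols  # cycles per image, horizontal
    radius = np.hypot(fy[:, None], fx[None, :])
    spectrum = np.fft.fft2(img)
    low = np.real(np.fft.ifft2(np.where(radius <= cutoff, spectrum, 0)))
    high = img - low
    return low, high
```

Because the two components sum back to the original image, the same source photograph can be shown to observers in low-pass, high-pass, or intact form, which is how studies like this isolate the frequency bands driving emotion identification.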