Is there a role of visual cortex in spatial hearing?
Related papers
The Journal of Neuroscience, 2012
In humans, whose ears are fixed on the head, auditory stimuli are initially registered in space relative to the head. Eventually, the locations of sound sources also need to be encoded relative to the body, or in absolute allocentric space, to allow orientation toward the sound sources and consequent action. We can therefore distinguish between two spatial representation systems: a basic head-centered coordinate system and a more complex head-independent system. In an ERP experiment, we attempted to reveal which of these two coordinate systems is represented in the human auditory cortex. We dissociated the two systems using the mismatch negativity (MMN), a well-studied EEG effect evoked by acoustic deviations. Contrary to previous findings suggesting that only primary head-related information is present at this early stage of processing, we observed significant MMN effects for both head-independent and head-centered deviant stimuli. Our findings thus reveal that both primary head-related and more complex head-independent representations are available at this early stage of auditory cortical processing.
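For readers unfamiliar with how an MMN is derived, here is a minimal sketch: the MMN is the deviant-minus-standard difference wave, typically quantified as a negative deflection roughly 100-250 ms after stimulus onset. The epoch arrays and sampling rate below are simulated placeholders, not the study's data or pipeline.

```python
import numpy as np

# Minimal sketch of a mismatch-negativity (MMN) difference wave.
# Epochs and sampling rate are simulated placeholders.
fs = 500                                   # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.5, 1 / fs)           # epoch time axis: -100..500 ms

rng = np.random.default_rng(0)
standard_epochs = rng.normal(size=(200, t.size))  # trials x samples
deviant_epochs = rng.normal(size=(40, t.size))

# ERPs are across-trial averages; the MMN is deviant minus standard.
erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)
mmn = erp_deviant - erp_standard

# Quantify the MMN as the most negative deflection in a 100-250 ms window.
window = (t >= 0.10) & (t <= 0.25)
peak_idx = np.argmin(mmn[window])
peak_latency_ms = t[window][peak_idx] * 1000
print(f"MMN peak at {peak_latency_ms:.0f} ms, amplitude {mmn[window][peak_idx]:.3f}")
```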
Processing of auditory spatial cues in human cortex: An fMRI study
Neuropsychologia, 2006
The issue of where coding of sound location is represented in the human cortex is still a matter of debate. It is unclear whether there are cortical areas that are specifically activated depending on the location of a sound. Are identical or distinct cortical areas in one hemisphere involved in the processing of sounds from the left and right? Moreover, the possibility that distinct areas have a preference for processing central versus eccentric sound locations has not been investigated so far. The present study focussed on these issues using functional magnetic resonance imaging (fMRI). Activations evoked by left, right and central sounds were analysed separately, and contrasts were computed between these conditions. We did not find areas that were involved exclusively in the processing of left, right or central sound positions. Rather, large overlapping areas were observed for the three sound stimuli, located in the temporal, parietal and frontal cortices of both hemispheres. This result argues for the idea of a widely distributed bilateral network accessing an internal representation of the body to encode stimulus position in relation to the body median plane. However, two areas (right BA 40 and left BA 37) were also found to have preferences for sound position. In particular, BA 40 turned out to be significantly more activated by central positions than by eccentric stimuli. In line with previous findings on visual perception, the latter observation supports the assumption that the right inferior parietal cortex may be preferentially involved in the perception of central stimulus positions in relation to the body.
Human Brain Mapping, 1999
Functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG) were used to study the relationships between lateralized auditory perception in humans and the contralaterality of processing in auditory cortex. Subjects listened to rapidly presented streams of short FM-sweep tone bursts to detect infrequent, slightly deviant tone bursts. The stimulus streams consisted of either monaural stimuli to one ear or the other, or binaural stimuli with brief interaural onset delays. The onset delay gives the binaural sounds a lateralized percept and is thought to be a key component of how our brains localize sounds in space. For the monaural stimuli, fMRI revealed a clear contralaterality in auditory cortex, with a contralaterality index (contralateral activity divided by the sum of contralateral and ipsilateral activity) of 67%. In contrast, the fMRI activations from the laterally perceived binaural stimuli indicated little or no contralaterality (index of 51%). The MEG recordings from the same subjects performing the same task converged qualitatively with the fMRI data, confirming a clear monaural contralaterality and no contralaterality for the laterally perceived binaural stimuli. However, the MEG monaural contralaterality (55%) was lower than the fMRI value and decreased across the several-hundred-millisecond poststimulus period, falling from 57% in the M50 latency range (20-70 ms) to 53% in the M200 range (170-250 ms). These data sets provide both a quantification of the degree of contralaterality in the auditory pathways and insight into the locus and mechanism of the lateralized perception of spatially lateralized sounds.
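The contralaterality index is defined explicitly in the abstract. As a quick worked illustration, a minimal Python sketch (the activity values below are made up for demonstration, not the study's measurements):

```python
def contralaterality_index(contra: float, ipsi: float) -> float:
    """Contralateral activity divided by total (contra + ipsi) activity.

    0.5 (50%) means no lateralization; 1.0 (100%) means fully contralateral.
    """
    return contra / (contra + ipsi)

# Illustrative numbers only (not taken from the study):
print(contralaterality_index(contra=67.0, ipsi=33.0))  # 0.67 -> "67%"
print(contralaterality_index(contra=51.0, ipsi=49.0))  # 0.51 -> essentially bilateral
```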
Processing of sound location in human cortex
European Journal of Neuroscience, 2008
This functional magnetic resonance imaging study focused on the neural substrates underlying human auditory space perception. In order to present natural-like sound locations to the subjects, acoustic stimuli convolved with individual head-related transfer functions were used. Activation foci, as revealed by analyses of contrasts and interactions between sound locations, formed a complex network, including anterior and posterior regions of the temporal lobe, posterior parietal cortex, dorsolateral prefrontal cortex and inferior frontal cortex. The distinct topography of this network resulted from different patterns of activation and deactivation, depending on sound location, in the respective voxels. These patterns suggested different levels of complexity in the processing of auditory spatial information, starting with simple left/right discrimination in the regions surrounding the primary auditory cortex, while the integration of information on hemispace and eccentricity of sound may take place at later stages. Activations were identified in regions assigned to both the dorsal and ventral auditory cortical streams, which are assumed to be preferentially concerned with the analysis of spatial and non-spatial sound features, respectively. The finding of activations in the ventral stream as well could, on the one hand, reflect the well-known functional duality of auditory spectral analysis, that is, the concurrent extraction of information on the location (due to the spectrotemporal distortions caused by the head and pinnae) and the spectral characteristics of a sound source. On the other hand, this result may suggest the existence of shared neural networks performing analyses of 'higher-order' auditory cues for both the localization and identification of sound sources.
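The stimulus-generation idea, rendering a virtual sound location by convolving a mono source with a pair of head-related impulse responses (HRIRs), can be sketched as follows. The HRIRs below are crude placeholders; the study used individually measured transfer functions.

```python
import numpy as np

fs = 44_100
mono = np.random.default_rng(1).normal(size=fs // 2)  # 0.5 s noise burst, placeholder

# Placeholder HRIRs for one virtual direction; a real measurement set
# (individually recorded, as in the study) would be loaded instead.
hrir_left = np.zeros(256)
hrir_left[0] = 1.0          # direct path to the near (left) ear
hrir_right = np.zeros(256)
hrir_right[20] = 0.6        # delayed and attenuated at the far (right) ear

# Convolving the source with each ear's impulse response yields a binaural
# signal carrying the ITD, IID and spectral cues for that direction.
left = np.convolve(mono, hrir_left)
right = np.convolve(mono, hrir_right)
binaural = np.stack([left, right], axis=1)  # samples x 2, for headphone playback
```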
European Journal of Neuroscience, 2002
Sounds convolved with individual head-related transfer functions and presented through headphones can give very natural percepts of three-dimensional auditory space. We recorded whole-scalp neuromagnetic responses to such stimuli to compare the reactivity of the human auditory cortex to sound azimuth and elevation. The results suggest that the human auditory cortex analyses sound azimuth, based on both binaural and monaural localization cues, mainly in the hemisphere contralateral to the sound, whereas elevation in the anterior space, and the lateral auditory space in general, both of which rely strongly on monaural spectral cues, are analyzed in more detail in the right auditory cortex. The binaural interaural time and interaural intensity difference cues were processed in the auditory cortex around 100-150 ms, and the monaural spectral cues later, around 200-250 ms.
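Since this abstract distinguishes the binaural cues (interaural time and intensity differences) from monaural spectral cues, a minimal sketch of how the two binaural cues can be estimated from a stereo signal may help. The cross-correlation and RMS-ratio estimators here are illustrative textbook methods, not the paper's analysis.

```python
import numpy as np

def itd_iid(left: np.ndarray, right: np.ndarray, fs: int) -> tuple[float, float]:
    """Estimate interaural time difference (s) and intensity difference (dB).

    ITD: lag of the peak of the interaural cross-correlation
         (negative lag means the left channel leads, under this convention).
    IID: left/right level ratio in dB (positive means left is louder).
    """
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    itd = lag / fs
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    iid = 20 * np.log10(rms(left) / rms(right))
    return itd, iid

# Toy binaural pair: right channel delayed by 20 samples and attenuated.
fs = 44_100
src = np.random.default_rng(2).normal(size=2048)
left = src
right = 0.6 * np.concatenate([np.zeros(20), src[:-20]])
print(itd_iid(left, right, fs))  # left leads: negative ITD, positive IID
```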
Journal of Neurophysiology, 2006
The aim of the current study was to measure the brain's response to auditory motion using electroencephalography (EEG) to gain insight into the mechanisms by which hemispheric lateralization for auditory spatial processing is established in the human brain. The onset of left- or rightward motion in an otherwise continuous sound was found to elicit a large response, which appeared to arise from higher-level nonprimary auditory areas. This motion onset response was strongly lateralized to the hemisphere contralateral to the direction of motion. The response latencies suggest that the ipsilateral response to leftward motion was produced by indirect callosal projections from the opposite hemisphere, whereas the ipsilateral response to rightward motion seemed to receive contributions from direct thalamocortical projections. These results suggest an asymmetry between the left and right auditory cortices in their reliance on inter-hemispheric projections for auditory spatial processing.
Linear processing of spatial cues in primary auditory cortex
Nature, 2001
This study reinforces this notion by relating changes in the angular declination below the horizon to perceived eye level, and by demonstrating that eye level serves as a reference for the visual system to compute the angular declination below the horizon. Methods: Thirteen naive observers, who gave informed consent, and one author with self-reported normal vision participated in the various experiments.
Hemispheric competence for auditory spatial representation
Brain, 2009
Sound localization relies on the analysis of interaural time and intensity differences, as well as on the attenuation patterns produced by the outer ear. We investigated the relative contributions of interaural time and intensity difference cues to sound localization by testing 60 subjects, including 25 with focal left and 25 with focal right hemispheric brain damage. Group and single-case behavioural analyses, as well as anatomo-clinical correlations, confirmed that deficits were more frequent and much more severe after right than after left hemispheric lesions, and for the processing of interaural time difference cues than for interaural intensity difference cues. For spatial processing based on interaural time difference cues, different error types were evident in the individual data. Deficits in discriminating between neighbouring positions occurred in both hemispaces after focal right hemispheric brain damage, but were restricted to the contralesional hemispace after focal left hemispheric brain damage. Alloacusis (perceptual shifts across the midline) occurred only after focal right hemispheric brain damage and was associated with minor or severe deficits in position discrimination. During spatial processing based on interaural intensity difference cues, deficits were less severe in the right hemispheric damage group than in the left hemispheric damage group, and no alloacusis occurred. These results, matched to anatomical data, suggest the existence of a binaural sound localization system predominantly based on interaural time difference cues and primarily supported by the right hemisphere. More generally, our data suggest that two distinct mechanisms contribute to: (i) the precise computation of spatial coordinates, allowing spatial comparison within the contralateral hemispace for the left hemisphere and within the whole space for the right hemisphere; and (ii) the building up of global auditory spatial representations in right temporo-parietal cortices. Abbreviations: IID = interaural intensity difference; ITD = interaural time difference; LHD = left hemispheric damage; RHD = right hemispheric damage