Right-hemisphere auditory cortex is dominant for coding syllable patterns in speech
Comparative Study
Daniel A Abrams et al. J Neurosci. 2008.
Abstract
Cortical analysis of speech has long been considered the domain of left-hemisphere auditory areas. A recent hypothesis posits that cortical processing of acoustic signals, including speech, is mediated bilaterally based on the component rates inherent to the speech signal. In support of this hypothesis, previous studies have shown that slow temporal features (3-5 Hz) in nonspeech acoustic signals lateralize to right-hemisphere auditory areas, whereas rapid temporal features (20-50 Hz) lateralize to the left hemisphere. These results were obtained using nonspeech stimuli, and it is not known whether right-hemisphere auditory cortex is dominant for coding the slow temporal features in speech known as the speech envelope. Here we show strong right-hemisphere dominance for coding the speech envelope, which represents syllable patterns and is critical for normal speech perception. Right-hemisphere auditory cortex was 100% more accurate in following contours of the speech envelope and had a 33% larger response magnitude while following the envelope compared with the left hemisphere. Asymmetries were evident regardless of the ear of stimulation despite dominance of contralateral connections in ascending auditory pathways. Results provide evidence that the right hemisphere plays a specific and important role in speech processing and support the hypothesis that acoustic processing of speech involves the decomposition of the signal into constituent temporal features by rate-specialized neurons in right- and left-hemisphere auditory cortex.
Figures
Figure 1.
Left column, Grand average cortical responses from three matched electrode pairs and broadband speech envelope for “clear” stimulus condition. The black lines represent the broadband speech envelope for the clear speech condition, the red lines represent cortical activity measured at right-hemisphere electrodes, and the blue lines represent activity from left-hemisphere electrodes. Ninety-five milliseconds of the prestimulus period is plotted. The speech envelope was shifted forward in time 85 ms to enable comparison to cortical responses; this time shift is for display purposes only. Right column, Cross-correlograms between clear speech envelope and individual subjects' cortical responses for each electrode pair. A small dot appears at the point chosen for subsequent stimulus-to-response correlation analyses.
Figure 2.
Average cross-correlogram peaks. Values represent the average peak lag and r value, collapsed across stimulus conditions, for each stimulus envelope–cortical response correlation at the three electrode pairs. Right-hemisphere electrodes are black, and left-hemisphere electrodes are gray. Error bars represent 1 SEM.
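The stimulus-to-response correlation described above can be sketched in a few lines of NumPy. This is a minimal illustration of the general technique (finding the lag and Pearson r at the peak of a stimulus-envelope/cortical-response cross-correlogram), not the authors' analysis code; the function name, sampling-rate parameter, and 250 ms lag window are illustrative assumptions.

```python
import numpy as np

def cross_correlogram_peak(envelope, response, fs, max_lag_ms=250):
    """Return (lag in ms, Pearson r) at the peak of the cross-correlogram
    between a stimulus envelope and a cortical response.

    envelope, response : 1-D arrays sampled at fs Hz (same length).
    max_lag_ms : search window for response lags (illustrative choice).
    """
    max_lag = int(max_lag_ms * fs / 1000)
    rs = []
    for lag in range(max_lag + 1):
        # Correlate the envelope with the response delayed by `lag` samples.
        env_seg = envelope[: len(envelope) - lag] if lag else envelope
        resp_seg = response[lag : lag + len(env_seg)]
        n = min(len(env_seg), len(resp_seg))
        rs.append(np.corrcoef(env_seg[:n], resp_seg[:n])[0, 1])
    rs = np.asarray(rs)
    best = int(np.argmax(rs))
    return best * 1000.0 / fs, rs[best]
```

With a synthetic response that lags its envelope by 85 ms, the peak lands at roughly 85 ms with r near 1, mirroring the lag/r pairs plotted in Figure 2.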
Figure 3.
Average RMS amplitudes for envelope-following and onset (inset) periods. The onset period was defined as 0–250 ms of the cortical response, and the envelope-following period was defined as 250–1500 ms (clear and conversational conditions) or 250–750 ms (compressed condition). Right-hemisphere electrodes are black, and left-hemisphere electrodes are gray. Error bars represent 1 SEM.
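The onset versus envelope-following RMS comparison reduces to computing root-mean-square amplitude over fixed time windows of the response. A minimal sketch, assuming a 1-D NumPy response array sampled at fs Hz (the function name is illustrative, not from the paper):

```python
import numpy as np

def window_rms(response, fs, start_ms, end_ms):
    """RMS amplitude of `response` within [start_ms, end_ms)."""
    seg = response[int(start_ms * fs / 1000) : int(end_ms * fs / 1000)]
    return np.sqrt(np.mean(seg ** 2))

# Example windows matching the caption: onset 0-250 ms;
# envelope-following 250-1500 ms (clear/conversational) or 250-750 ms (compressed).
# onset_rms = window_rms(resp, fs, 0, 250)
# following_rms = window_rms(resp, fs, 250, 1500)
```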
Figure 4.
Left-ear versus right-ear stimulation comparison: speech envelope phase-locking. Right-hemisphere electrodes are black, and left-hemisphere electrodes are gray. Error bars represent 1 SEM. LE Stim, Left-ear stimulation; RE Stim, right-ear stimulation.
Figure 5.
Left-ear versus right-ear stimulation comparison: RMS amplitude of the envelope-following period. Right-hemisphere electrodes are black, and left-hemisphere electrodes are gray. Error bars represent 1 SEM. Inset, RMS comparison of the onset period. LE Stim, Left-ear stimulation; RE Stim, right-ear stimulation.