Motor speech perception modulates the cortical language areas
Related papers
The contribution of the frontal lobe to the perception of speech
Journal of Neurolinguistics, 2010
Classical models of language claim a clear-cut distinction between language production and perception, assigning each a different localization in the brain and limiting the frontal lobe's involvement exclusively to motor functions. In this review we present empirical evidence for a weaker separation between sensory and motor functions, showing that the motor system also plays an important role in perception. In particular, very recent neurophysiological literature shows that a selective alteration of neural activity in speech motor centers alters speech perception. This result not only confirms that the classical sensory-versus-motor separation has to be abandoned, but also underlines the causal contribution of the frontal lobe to the perception of speech.
Journal of Speech, Language, and Hearing Research, 2009
Purpose It is unclear whether the production and perception of speech movements are subserved by the same brain networks. The purpose of this study was to investigate neural recruitment in cortical areas commonly associated with speech production during the production and visual perception of speech. Method This study utilized functional magnetic resonance imaging (fMRI) to assess brain function while participants either imitated or observed speech movements. Results A common neural network was recruited by both tasks. The greatest frontal lobe activity in Broca’s area was triggered not only when producing speech but also when watching speech movements. Relatively less activity was observed in the left anterior insula during both tasks. Conclusion These results support the emerging view that cortical areas involved in the execution of speech movements are also recruited in the perception of the same movements in other speakers.
Journal of …, 2002
In this study we will give an overview of the experimental work on the neuroanatomical correlates of language and speech production that we have done in recent years. First we will introduce the methodology of event-related functional magnetic resonance neuro-imaging and the experimental paradigm that we employed. Then we will present and discuss the results of our experiments on (1) speech motor control, (2) articulatory complexity, (3) the neuroanatomical correlates of prosody, and (4) the neurocognitive substrates of syntactic processing. Experiments (1) and (2) show that the expected large motor speech network consisting of SMA, motor cortex and cerebellum is only active in planning and execution of simple articulatory movements. Increased articulatory complexity leads to more focused activation. Furthermore, we can show that only the execution of speech movements recruits the left anterior insula, while articulatory planning does not. The results of experiment (3) indicate that it is not the function of prosody (linguistic vs affective) that controls lateralization of prosodic processing, but that more general characteristics of the processing units, like the size of the prosodic frame, are responsible for the activation of different cortical regions. Finally, in experiment (4) we present first results on syntactic processing in speech production. Besides the expected activation of Broca's area we found activations in Wernicke's area and in the cerebellum. We have also found evidence for activations in other cortical areas, which are less often implicated in clinical studies on brain-language correlations. The cognitive relevance of these areas and networks is still to be elucidated.
Seeing the articulatory gestures of the speaker (''speech reading'') enhances speech perception, especially in noisy conditions. Recent neuroimaging studies tentatively suggest that speech reading activates the speech motor system, which then influences superior-posterior temporal lobe auditory areas via an efference copy. Here, nineteen healthy volunteers were presented with silent videoclips of a person articulating Finnish vowels /a/, /i/ (non-targets), and /o/ (targets) during event-related functional magnetic resonance imaging (fMRI). Speech reading significantly activated visual cortex, posterior fusiform gyrus (pFG), posterior superior temporal gyrus and sulcus (pSTG/S), and the speech motor areas, including premotor cortex, parts of the inferior (IFG) and middle (MFG) frontal gyri extending into frontal polar (FP) structures, somatosensory areas, and supramarginal gyrus (SMG). Structural equation modelling (SEM) of these data suggested that information flows first from extrastriate visual cortex to pFG, and from there, in parallel, to pSTG/S and MFG/FP. From pSTG/S, information flow continues to IFG or SMG and eventually to somatosensory areas. Feedback connectivity was estimated to run from MFG/FP to IFG and pSTG/S. The direct functional connection from pFG to MFG/FP and the feedback connection from MFG/FP to pSTG/S and IFG support the hypothesis of prefrontal speech motor areas influencing auditory speech processing in pSTG/S via an efference copy.
Cerebral areas associated with motor control of speech in humans.
Journal of Applied Physiology, 1997
We have defined areas in the brain activated during speaking, utilizing positron emission tomography. Six normal subjects continuously repeated the phrase "Buy Bobby a poppy" (requiring minimal language processing) in four ways: A) spoken aloud, B) mouthed silently, C) without articulation, and D) thought silently. Statistical comparison of images from conditions A with C and B with D highlighted areas associated with articulation alone, because control of breathing for speech was controlled for; we found bilateral activations in sensorimotor cortex and cerebellum, with right-sided activation in the thalamus/caudate nucleus. Contrasting images from conditions A with B and C with D highlighted areas associated with the control of breathing for speech, vocalization, and hearing, because articulation was controlled for; we found bilateral activations in sensorimotor and motor cortex, close to but distinct from the activations in the preceding contrast, together with activations in thalamus, cerebellum, and supplementary motor area. In neither subtraction was there activation in Broca's area. These results emphasize the bilaterality of the cerebral control of "speaking" without language processing.
The contribution of the inferior parietal cortex to spoken language production
Brain and Language, 2012
This functional MRI study investigated the involvement of the left inferior parietal cortex (IPC) in spoken language production (Speech). Its role has been apparent in some studies but not others, and is not convincingly supported by clinical studies as they rarely include cases with lesions confined to the parietal lobe. We compared Speech with non-communicative repetitive tongue movements (Tongue). The data were analyzed with both univariate contrasts between conditions and probabilistic independent component analysis (ICA). The former indicated decreased activity of left IPC during Speech relative to Tongue. However, the ICA revealed a Speech component in which there was correlated activity between left IPC, frontal and temporal cortices known to be involved in language. Therefore, although net synaptic activity throughout the left IPC may not increase above baseline conditions during Speech, one or more local systems within this region are involved, evidenced by the correlated activity with other language regions.
fMRI reveals two distinct cerebral networks subserving speech motor control
Neurology, 2005
Background: There are few data on the cerebral organization of motor aspects of speech production and the pathomechanisms of dysarthric deficits subsequent to brain lesions and diseases. The authors used fMRI to further examine the neural basis of speech motor control. Methods and Results: In eight healthy volunteers, fMRI was performed during syllable repetitions synchronized to click trains (2 to 6 Hz; vs a passive listening task). Bilateral hemodynamic responses emerged at the level of the mesiofrontal and sensorimotor cortex, putamen/pallidum, thalamus, and cerebellum (two distinct activation spots on either side). In contrast, dorsolateral premotor cortex and anterior insula showed left-sided activation. Calculation of rate/response functions revealed a negative linear relationship between repetition frequency and blood oxygen level-dependent (BOLD) signal change within the striatum, whereas both cerebellar hemispheres exhibited a step-wise increase of activation at ~3 Hz. Analysis of the temporal dynamics of the BOLD effect found the various cortical and subcortical brain regions engaged in speech motor control to be organized into two separate networks (medial and dorsolateral premotor cortex, anterior insula, and superior cerebellum vs sensorimotor cortex, basal ganglia, and inferior cerebellum). Conclusion: These data provide evidence for two levels of speech motor control, presumably bound to motor preparation and execution processes. They also help to explain clinical observations such as an unimpaired or even accelerated speaking rate in Parkinson disease and slowed speech tempo, which does not fall below a rate of ~3 Hz, in cerebellar disorders.
Human Temporal Lobe Activation by Speech and Nonspeech Sounds
Cerebral Cortex, 2000
Functional organization of the lateral temporal cortex in humans is not well understood. We recorded blood oxygenation signals from the temporal lobes of normal volunteers using functional magnetic resonance imaging during stimulation with unstructured noise, frequency-modulated (FM) tones, reversed speech, pseudowords and words. For all conditions, subjects performed a material-nonspecific detection response when a train of stimuli began or ceased. Dorsal areas surrounding Heschl's gyrus bilaterally, particularly the planum temporale and dorsolateral superior temporal gyrus, were more strongly activated by FM tones than by noise, suggesting a role in processing simple temporally encoded auditory information. Distinct from these dorsolateral areas, regions centered in the superior temporal sulcus bilaterally were more activated by speech stimuli than by FM tones. Identical results were obtained in this region using words, pseudowords and reversed speech, suggesting that the speech-tones activation difference is due to acoustic rather than linguistic factors. In contrast, previous comparisons between word and nonword speech sounds showed left-lateralized activation differences in more ventral temporal and temporoparietal regions that are likely involved in processing lexical-semantic or syntactic information associated with words. The results indicate functional subdivision of the human lateral temporal cortex and provide a preliminary framework for understanding the cortical processing of speech sounds.
Chang et al., speech versus non-speech
NeuroImage, 2009
The issue of whether speech is supported by the same neural substrates as non-speech vocal tract gestures has been contentious. In this fMRI study we tested whether producing non-speech vocal tract gestures in humans shares the same functional neuroanatomy as nonsense speech syllables. Production of non-speech vocal tract gestures, devoid of phonological content but similar to speech in that they had familiar acoustic and somatosensory targets, was compared to the production of speech syllables without meaning. Brain activation related to overt production was captured with BOLD fMRI using a sparse sampling design for both conditions. Speech and non-speech were compared using voxel-wise whole brain analyses, and ROI analyses focused on frontal and temporoparietal structures previously reported to support speech production. Results showed substantial overlap in activation between speech and non-speech in the regions examined. Although non-speech gesture production showed greater extent and amplitude of activation in these regions, both speech and non-speech showed comparable left laterality in activation for both target perception and production. These findings suggest a more general role of the previously proposed "auditory dorsal stream" in the left hemisphere, supporting the production of vocal tract gestures that are not limited to speech processing.