Chang et al.

Common neural substrates support speech and non-speech vocal tract gestures

NeuroImage, 2009

The issue of whether speech is supported by the same neural substrates as non-speech vocal tract gestures has been contentious. In this fMRI study we tested whether producing non-speech vocal tract gestures in humans shares the same functional neuroanatomy as nonsense speech syllables. Production of non-speech vocal tract gestures, devoid of phonological content but similar to speech in that they had familiar acoustic and somatosensory targets, was compared to the production of speech syllables without meaning. Brain activation related to overt production was captured with BOLD fMRI using a sparse sampling design for both conditions. Speech and non-speech were compared using voxel-wise whole brain analyses, and ROI analyses focused on frontal and temporoparietal structures previously reported to support speech production. Results showed substantial overlap between speech and non-speech activation in these regions. Although non-speech gesture production showed greater extent and amplitude of activation in the regions examined, both speech and non-speech showed comparable left laterality in activation for both target perception and production. These findings suggest a more general role of the previously proposed "auditory dorsal stream" in the left hemisphere to support the production of vocal tract gestures that are not limited to speech processing.

Phonatory and articulatory representations of speech production in cortical and subcortical fMRI responses

Scientific Reports

Speaking involves coordination of multiple neuromotor systems, including respiration, phonation and articulation. Developing non-invasive imaging methods to study how the brain controls these systems is critical for understanding the neurobiology of speech production. Recent models and animal research suggest that regions beyond the primary motor cortex (M1), including cortical and subcortical regions, help orchestrate the neuromotor control needed for speaking. Using contrasts between speech conditions with controlled respiratory behavior, this fMRI study investigates articulatory gestures involving the tongue, lips and velum (i.e., alveolars versus bilabials, and nasals versus orals), and phonatory gestures (i.e., voiced versus whispered speech). Multivariate pattern analysis (MVPA) was used to decode articulatory gestures in M1, cerebellum and basal ganglia. Furthermore, apart from confirming the role of a mid-M1 region for phonation, we found that a dorsal M1 region, linked to r...

Giving speech a hand: Gesture modulates activity in auditory cortex during speech perception

Human Brain Mapping, 2009

Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture—a fundamental type of hand gesture that marks speech prosody—might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body all presented without speech. Validating behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions.

Differential Representation of Articulatory Gestures and Phonemes in Precentral and Inferior Frontal Gyri

The Journal of Neuroscience, 2018

Speech is a critical form of human communication and is central to our daily lives. Yet, despite decades of study, an understanding of the fundamental neural control of speech production remains incomplete. Current theories model speech production as a hierarchy from sentences and phrases down to words, syllables, speech sounds (phonemes), and the actions of vocal tract articulators used to produce speech sounds (articulatory gestures). Here, we investigate the cortical representation of articulatory gestures and phonemes in ventral precentral and inferior frontal gyri in men and women. Our results indicate that ventral precentral cortex represents gestures to a greater extent than phonemes, while inferior frontal cortex represents both gestures and phonemes. These findings suggest that speech production shares a common cortical representation with that of other types of movement, such as arm and hand movements. This has important implications both for our understanding of speech pr...

The somatotopy of speech: Phonation and articulation in the human motor cortex

Brain and Cognition, 2009

A sizable literature on the neuroimaging of speech production has reliably shown activations in the orofacial region of the primary motor cortex. These activations have invariably been interpreted as reflecting "mouth" functioning and thus articulation. We used functional magnetic resonance imaging to compare an overt speech task with tongue movement, lip movement, and vowel phonation. The results showed that the strongest motor activation for speech was the somatotopic larynx area of the motor cortex, thus reflecting the significant contribution of phonation to speech production. In order to analyze further the phonatory component of speech, we performed a voxel-based meta-analysis of neuroimaging studies of syllable-singing (11 studies) and compared the results with a previously published meta-analysis of oral reading (11 studies), showing again a strong overlap in the larynx motor area. Overall, these findings highlight the under-recognized presence of phonation in imaging studies of speech production, and support the role of the larynx motor cortex in mediating the "melodicity" of speech.

Modulation of Frontal Lobe Speech Areas Associated With the Production and Perception of Speech Movements

Journal of Speech, Language, and Hearing Research, 2009

Purpose: It is unclear whether the production and perception of speech movements are subserved by the same brain networks. The purpose of this study was to investigate neural recruitment in cortical areas commonly associated with speech production during the production and visual perception of speech. Method: This study utilized functional magnetic resonance imaging (fMRI) to assess brain function while participants either imitated or observed speech movements. Results: A common neural network was recruited by both tasks. The greatest frontal lobe activity, in Broca's area, was triggered not only when producing speech but also when watching speech movements. Relatively less activity was observed in the left anterior insula during both tasks. Conclusion: These results support the emerging view that cortical areas involved in the execution of speech movements are also recruited in the perception of the same movements in other speakers.

Left dorsal speech stream components and their contribution to phonological processing

The Journal of neuroscience : the official journal of the Society for Neuroscience, 2015

Models propose an auditory-motor mapping via a left-hemispheric dorsal speech-processing stream, yet its detailed contributions to speech perception and production are unclear. Using fMRI-navigated repetitive transcranial magnetic stimulation (rTMS), we virtually lesioned left dorsal stream components in healthy human subjects and probed the consequences on speech-related facilitation of articulatory motor cortex (M1) excitability, as indexed by increases in motor-evoked potential (MEP) amplitude of a lip muscle, and on speech processing performance in phonological tests. Speech-related MEP facilitation was disrupted by rTMS of the posterior superior temporal sulcus (pSTS), the sylvian parieto-temporal region (SPT), and by double-knock-out but not individual lesioning of the pars opercularis of the inferior frontal gyrus (pIFG) and the dorsal premotor cortex (dPMC), and not by rTMS of the ventral speech-processing stream or an occipital control site. rTMS of the dorsal stream but not of...

Neural correlates of the processing of co-speech gestures

NeuroImage, 2008

In communicative situations, speech is often accompanied by gestures. For example, speakers tend to illustrate certain contents of speech by means of iconic gestures, which are hand movements that bear a formal relationship to the contents of speech. The meaning of an iconic gesture is determined both by its form and by the speech context in which it is performed. Thus, gesture and speech interact in comprehension. Using fMRI, the present study investigated what brain areas are involved in this interaction process. Participants watched videos in which sentences containing an ambiguous word (e.g. "She touched the mouse") were accompanied by either a meaningless grooming movement, a gesture supporting the more frequent dominant meaning (e.g. animal) or a gesture supporting the less frequent subordinate meaning (e.g. computer device). We hypothesized that brain areas involved in the interaction of gesture and speech would show greater activation to gesture-supported sentences as compa...

Functional lateralization of speech production at primary motor cortex: a fMRI study

Neuroreport, 1996

To evaluate lateralization of speech production at the level of the Rolandic cortex, functional magnetic resonance imaging (1.5 Tesla, 27 parallel axial slices, EPI technique) was performed during a speech task (continuous silent recitation of the names of the months of the year). As control conditions, non-speech tongue movements and silent singing of a well-known melody with the syllable 'la' as its carrier were considered. Tongue movements produced symmetrical activation at the lower primary motor cortex. During automatic speech a strong functional lateralization to the left hemisphere emerged within the same area. In contrast, singing yielded a predominant right-sided activation of the Rolandic region. Functional lateralization of speech production therefore seems to include the precentral gyrus as well as Broca's area.

Cerebral areas associated with motor control of speech in humans.

Journal of Applied Physiology, 1997

We have defined areas in the brain activated during speaking, utilizing positron emission tomography. Six normal subjects continuously repeated the phrase "Buy Bobby a poppy" (requiring minimal language processing) in four ways: A) spoken aloud, B) mouthed silently, C) vocalized without articulation, and D) thought silently. Statistical comparison of images from conditions A with C and B with D highlighted areas associated with articulation alone, because control of breathing for speech was held constant; we found bilateral activations in sensorimotor cortex and cerebellum, with right-sided activation in the thalamus/caudate nucleus. Contrasting images from conditions A with B and C with D highlighted areas associated with the control of breathing for speech, vocalization, and hearing, because articulation was held constant; we found bilateral activations in sensorimotor and motor cortex, close to but distinct from the activations in the preceding contrast, together with activations in thalamus, cerebellum, and supplementary motor area. In neither subtraction was there activation in Broca's area. These results emphasize the bilaterality of the cerebral control of "speaking" without language processing.