The speaking brain: a tutorial introduction to fMRI experiments in the production of speech, prosody and syntax
Related papers
Motor speech perception modulates the cortical language areas
NeuroImage, 2008
Traditionally, the left frontal and parietal lobes have been associated with language production, while regions in the temporal lobe are seen as crucial for language comprehension. However, recent evidence suggests that the classical language areas constitute an integrated network in which each area plays a crucial role in both speech production and perception. We used functional MRI to examine whether observing speech motor movements (without auditory speech), relative to non-speech motor movements, preferentially activates the cortical speech areas. Furthermore, we tested whether the activation in these regions was modulated by task difficulty. This dissociates areas actively involved in speech perception from regions that show an obligatory activation in response to speech movements (e.g. areas that automatically activate in preparation for a motoric response). Specifically, we hypothesized that regions involved in decoding oral speech would show increasing activation with increasing difficulty. We found that speech movements preferentially activate the frontal and temporal language areas. In contrast, non-speech movements preferentially activate the parietal region. Degraded speech stimuli increased both frontal and parietal lobe activity but did not differentially excite the temporal region. These findings suggest that the frontal language area plays a role in visual speech perception and highlight the differential roles of the classical speech and language areas in processing others' motor speech movements.
THE BRAIN BASIS OF LANGUAGE PROCESSING: FROM STRUCTURE TO FUNCTION
Language processing is a trait of the human species. Knowledge about its neurobiological basis has increased considerably over the past decades. Different brain regions in the left and right hemispheres have been identified as supporting particular language functions. Networks involving the temporal cortex and the inferior frontal cortex with a clear left lateralization were shown to support syntactic processes, whereas less lateralized temporo-frontal networks subserve semantic processes. These networks have been substantiated by both functional and structural connectivity data. Electrophysiological measures indicate that, within these networks, syntactic processes of local structure building precede the assignment of grammatical and semantic relations in a sentence. Suprasegmental prosodic information overtly available in the acoustic language input is processed predominantly in a temporo-frontal network in the right hemisphere, associated with a clear electrophysiological marker. Studies of patients suffering from lesions in the corpus callosum reveal that the posterior portion of this structure plays a crucial role in the interaction of syntactic and prosodic information during language processing.
Functional neuroimaging contributions to neurolinguistics
Journal of Neurolinguistics, 2003
What has functional neuroimaging contributed to linguistics in general and to neurolinguistics in particular? And what can one reasonably expect it to contribute in the future? These are the basic questions that the contributors to this special issue of the Journal of Neurolinguistics, editors included, undertook to address. Questions of this sort may be addressed in at least two distinct ways: directly, through lengthy commentaries on the literature, or indirectly, with minimal critical comment and emphasis on the juxtaposition of specific studies representative of the extant literature. We have chosen the latter approach, which places the responsibility of appraisal squarely on the reader, who is the ultimate consumer of the literature, and we have reserved for ourselves the role of facilitators, a role that we believe we can fulfill in two ways. First, by soliciting a set of papers that represent fairly accurately the contemporary literature on applications of functional neuroimaging methods to problems in neurolinguistics. Second, by bringing to the reader's attention technical issues, awareness of which is essential for an accurate evaluation of that literature. It appears, at least on the surface, that we have discharged our first responsibility satisfactorily: the sample of works solicited for this special issue appears to be fairly representative of the caliber of publications one encounters in all professional journals.
They are also representative in terms of the relative frequency of use of particular imaging methods (most involve functional magnetic resonance imaging, or fMRI), the geographic dispersion of the contributors' countries of residence and work (seven countries and three continents are represented), genre (the papers include one case study, three critical reviews, and seven reports of typical experiments), and specific topic (bilingual brain organization, verbal working memory, phonological and semantic processing, hemispheric specialization within prefrontal cortex, neural components of reading, developmental changes in lexical processing, and the localization of syntactic processing mechanisms). Clearly, a small sample of 11 papers cannot represent all trends, whether theoretical or technical, and all shades of opinion prevailing in the field, but we believe this sample provides a fairly accurate picture of the most prevalent trends and opinions. To discharge the second responsibility we have assumed as editors of this special issue, we summarize below some key concepts that the reader ought to keep in mind.
Dissociating linguistic and nonlinguistic gestural communication in the brain
Neuroimage, 2004
Gestures of the face, arms, and hands are components of signed languages used by Deaf people. Signaling codes, such as the racecourse betting code known as Tic Tac, are also made up of such gestures. Tic Tac lacks the phonological structure of British Sign Language (BSL) but is similar in terms of its visual and articulatory components. Using fMRI, we compared the neural correlates of viewing a gestural language (BSL) and a manual-brachial code (Tic Tac) relative to a low-level baseline task. We compared three groups: Deaf native signers, hearing native signers, and hearing nonsigners. None of the participants had any knowledge of Tic Tac. All three groups activated an extensive frontal-posterior network in response to both types of stimuli. Superior temporal cortex, including the planum temporale, was activated bilaterally in response to both types of gesture in all groups, irrespective of hearing status. The engagement of these traditionally auditory processing regions was greater in Deaf than in hearing participants. These data suggest that the planum temporale may be responsive to visual movement in both deaf and hearing people, yet when hearing is absent early in development, the visual processing role of this region is enhanced. Greater activation for BSL than Tic Tac was observed in signers, but not in nonsigners, in the left posterior superior temporal sulcus and gyrus, extending into the supramarginal gyrus. This suggests that the left posterior perisylvian cortex is of fundamental importance to language processing, regardless of the modality in which it is conveyed.
Brain-Language Research: Where is the Progress?
Biolinguistics
Recent cognitive neuroscience research has improved our understanding of where, when, how, and why language circuits emerge and activate in the human brain. Where: regions crucial for very specific linguistic processes were delineated; phonetic features and fine semantic categories could be mapped onto specific sets of cortical areas. When: brain correlates of phonological, syntactic and semantic processes were documented early on, suggesting language understanding in an instant (within 250 ms). How: new mechanistic network models mimicking the structure and function of left-perisylvian language areas suggest that multimodal action-perception circuits — rather than separate modules for action and perception — carry the processing resources for language use and understanding. Why language circuits emerge in specific areas, become active at specific early time points and are connected in specific ways is best addressed in light of neuroscience principles governing neuronal activation, correlat...
Rethinking the neurological basis of language
Lingua, 2005
Functional neuroimaging has, within 10 years, produced evidence that leads us to question a number of the standard assumptions about the areas that are necessary and sufficient for language processing. Although neuroimaging evidence has corroborated much neuropsychological data, it forces a revision of a number of the standard interpretations of those data, and some traditionally accepted notions must be discarded entirely. We provide an overview of some issues that have arisen in these years, giving examples from a number of laboratories and illustrating with experiments of our own. The circumstances under which the left posterior temporal lobe (Wernicke's area) and the left inferior frontal gyrus (Broca's area) are activated are reviewed, and several views of how they contribute to language processing are considered in the light of this evidence. Further evidence for the contribution of a number of other areas to language comprehension is reviewed, including the anterior temporal lobe, the cerebellum, the left superior median frontal lobe, the anterior insula and the left inferior temporal-occipital junction. Further, we discuss some of the conditions under which the right hemisphere contributes to language processing. We conclude by discussing the implications of this research for the concept of modularity in the sense of Fodor [Modularity of Mind, MIT Press, Cambridge, 1983].
Activation of language cortex with automatic speech tasks
Neurology, 2000
Objective: To identify automatic speech tasks that reliably demonstrate increased regional cerebral blood flow (rCBF) in Broca's and Wernicke's areas of the cortex using PET. Background: Localizing language with direct cortical stimulation mapping requires that patients have a stable baseline on tests that engage eloquent cortex. For dysphasic patients or younger children, automatic speech tasks such as counting are often used in lieu of more complex language tests. Evidence from both lesion and neuroimaging studies suggests that these tasks may not adequately engage language cortices. In this study, we examined rCBF during automatic oromotor and speech tasks of varying complexity to identify those eliciting increased CBF in Broca's and Wernicke's areas. Methods: Eight normal volunteers underwent PET during rest, tongue movements, and three automatic speech tasks: repeating a phoneme sequence, repeating the months of the year, and reciting a memorized prose passage. Images were averaged across subjects and compared across tasks for regional localization and laterality. Results: Whereas all activation tasks produced increased relative CBF in brain regions correlated with articulation and auditory processing, only the two tasks that used real words (versus phonemes) showed left-lateralized rCBF increases in the posterior superior temporal lobe (Wernicke's area), and only the prose repetition task produced left-lateralized activity in Broca's area. Conclusions: Whereas automatic speech typically does not engage language cortex, repeating a memorized prose passage showed unambiguous activation in both Broca's and Wernicke's areas. These results caution against the use of common automatic speech tasks for mapping eloquent cortex and suggest an alternative task for those with poor language abilities or acquired dysphasia who cannot perform standardized language tests reliably.