Disentangling Hierarchical and Sequential Computations during Sentence Processing
Related papers
Topics in Cognitive Science, 2019
In many domains of human cognition, hierarchically structured representations are thought to play a key role. In this paper, we start with some foundational definitions of key phenomena like "sequence" and "hierarchy," and then outline potential signatures of hierarchical structure that can be observed in behavioral and neuroimaging data. Appropriate behavioral methods include classic ones from psycholinguistics along with some from the more recent artificial grammar learning and sentence processing literature. We then turn to neuroimaging evidence for hierarchical structure with a focus on the functional MRI literature. We conclude that, although a broad consensus exists about a role for a neural circuit incorporating the inferior frontal gyrus, the superior temporal sulcus, and the arcuate fasciculus, considerable uncertainty remains about the precise computational function(s) of this circuit. Correspondence should be sent to Julia Uddén.
Neural basis of processing sequential and hierarchical syntactic structures
Human Brain Mapping, 2007
The psychological processes through which humans learn a language have gained considerable interest over the past years. It has been previously suggested that language acquisition partly relies on a rule-based mechanism that is mediated by the frontal cortex. Interestingly, the actual structure involved within the frontal cortex varies with the kind of rules being processed. By means of functional MRI we investigated the neural underpinnings of rule-based language processing using an artificial language that allows direct comparisons between local phrase structure dependencies and hierarchically structured long-distance dependencies. Activation in the left ventral premotor cortex (PMC) was related to the local character of rule change, whereas long-distance dependencies activated the opercular part of the inferior frontal gyrus (Broca's area (BA) 44). These results suggest that the brain's involvement in syntactic processing is determined by the type of rule used, with BA 44/45 playing an important role during language processing when long-distance dependencies are processed. In contrast, the ventral PMC seems to subserve the processing of local dependencies. In addition, hippocampal activity was observed for local dependencies, indicating that the processing of such dependencies may be mediated by a second mechanism. Hum Brain Mapp 28: 585-592, 2007.
Hierarchical and sequential processing of language
Ding et al. (2017) contrast their view that language processing is based on hierarchical syntactic structures, to a view that relies on word-level input statistics. In this response to their paper, we clarify how, exactly, the two views differ (and how they do not), and make a case for the importance of sequential, as opposed to hierarchical, structure for language processing.
Neurophysiological dynamics of phrase-structure building during sentence processing
Although sentences unfold sequentially, one word at a time, most linguistic theories propose that their underlying syntactic structure involves a tree of nested phrases rather than a linear sequence of words. Whether and how the brain builds such structures, however, remains largely unknown. Here, we used human intracranial recordings and visual word-by-word presentation of sentences and word lists to investigate how left-hemispheric brain activity varies during the formation of phrase structures. In a broad set of language-related areas, comprising multiple superior temporal and inferior frontal sites, high-gamma power increased with each successive word in a sentence but decreased suddenly whenever words could be merged into a phrase. Regression analyses showed that each additional word or multiword phrase contributed a similar amount of additional brain activity, providing evidence for a merge operation that applies equally to linguistic objects of arbitrary complexity. More superficial models of language, based solely on sequential transition probability over lexical and syntactic categories, only captured activity in the posterior middle temporal gyrus. Formal model comparison indicated that the model of multiword phrase construction provided a better fit than probability-based models at most sites in superior temporal and inferior frontal cortices. Activity in those regions was consistent with a neural implementation of a bottom-up or left-corner parser of the incoming language stream. Our results provide initial intracranial evidence for the neurophysiological reality of the merge operation postulated by linguists and suggest that the brain compresses syntactically well-formed sequences of words into a hierarchy of nested phrases. Keywords: intracranial, merge, constituent, neurolinguistics, open nodes.
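The "open nodes" dynamic this abstract describes — a count that rises with each incoming word and drops whenever words are merged into a phrase — can be sketched with a toy stack model. This is a hypothetical illustration of the general idea, not the authors' implementation; the bracketed-input format and function name are invented for the example.

```python
def open_nodes_trace(bracketed):
    """Track the number of open (unmerged) nodes word by word.

    Each word pushes one node onto the stack; a closing bracket merges
    the phrase's daughters into a single node, so the stack shrinks --
    mirroring the rise-per-word / drop-at-merge pattern described above.
    Input uses space-separated flat bracketing, e.g. "[ [ the dog ] barked ]".
    """
    stack = []   # one entry per unmerged node
    marks = []   # stack positions where each open phrase began
    trace = []   # (token, open-node count) after each word or merge
    for tok in bracketed.split():
        if tok == "[":
            marks.append(len(stack))
        elif tok == "]":
            start = marks.pop()
            merged = stack[start:]
            del stack[start:]
            stack.append(merged)          # merge daughters into one node
            trace.append(("]", len(stack)))
        else:
            stack.append(tok)
            trace.append((tok, len(stack)))
    return trace
```

Running it on "[ [ the dog ] barked ]" shows the count climbing to 2 at "dog", collapsing to 1 when the noun phrase closes, rising again at "barked", and collapsing at the final merge.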
Insensitivity of the Human Sentence-Processing System to Hierarchical Structure
Although it is generally accepted that hierarchical phrase structures are instrumental in describing human language, their cognitive status is still debated. We investigated the role of hierarchical structure in sentence processing by implementing a range of probabilistic language models, some of which depend on hierarchical structure whereas others use sequential structure only. All models estimated the occurrence probabilities of syntactic categories in sentences for which reading-time data was available. Relating the models’ probability estimates to the data showed that the hierarchical-structure models do not account for reading times over and above all of the sequential-structure models. This suggests that a sentence’s hierarchical structure, unlike many other sources of information, does not noticeably affect the generation of expectations about upcoming words.
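The sequential-structure side of such a model comparison can be sketched as a bigram (Markov) model over syntactic categories, whose word-by-word surprisal would then be regressed against reading times. This is an illustrative minimal sketch only — the tag names and add-one smoothing are assumptions for the example, and the paper's actual models are considerably richer.

```python
import math
from collections import Counter, defaultdict

def train_bigram(tagged_corpus):
    """Fit bigram transition probabilities over syntactic categories
    (a purely sequential model), with add-one smoothing over the tag set."""
    tags = {t for sent in tagged_corpus for t in sent} | {"<s>"}
    counts = defaultdict(Counter)
    for sent in tagged_corpus:
        for prev, cur in zip(["<s>"] + sent, sent):
            counts[prev][cur] += 1
    def prob(prev, cur):
        return (counts[prev][cur] + 1) / (sum(counts[prev].values()) + len(tags))
    return prob

def surprisal(prob, sent):
    """Per-word surprisal -log2 P(tag | previous tag), the quantity such
    sequential models relate to reading times."""
    return [-math.log2(prob(prev, cur))
            for prev, cur in zip(["<s>"] + sent, sent)]

# Toy tagged corpus and a test sentence (hypothetical data).
corpus = [["DET", "NOUN", "VERB"], ["DET", "NOUN", "VERB", "DET", "NOUN"]]
prob = train_bigram(corpus)
per_word = surprisal(prob, ["DET", "NOUN", "VERB"])
```

A hierarchical competitor would replace the bigram `prob` with probabilities conditioned on partial phrase structures; the comparison then asks whether its surprisal values explain reading times beyond the sequential baseline.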
Journal of cognitive …, 2006
The present study investigated the processing of two types of artificial grammars by means of event-related brain potentials. Two categories of meaningless CV syllables were applied in each grammar type. The two grammars differed with regard to the type of the underlying rule. The finite-state grammar (FSG) followed the rule (AB)^n, thereby generating local transitions between As and Bs (e.g., n = 2, ABAB). The phrase structure grammar (PSG) followed the rule A^nB^n, thereby generating center-embedded structures in which the first A and the last B embed the middle elements (e.g., n = 2, [A[AB]B]). Two sequence lengths (n = 2, n = 4) were used. Violations of the structures were introduced at different positions of the syllable sequences. Early violations were situated at the beginning of a sequence, and late violations were placed at the end of a sequence. A posteriorly distributed early negativity elicited by violations was present only in FSG. This effect was interpreted as the possible reflection of a violated local expectancy. Moreover, both grammar-type violations elicited a late positivity. This positivity varied as a function of the violation position in PSG, but not in FSG. These findings suggest that the late positivity could reflect difficulty of integration in PSG sequences.
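The formal difference between the two rule types is easy to make concrete: (AB)^n is recognizable by checking local transitions pair by pair, whereas A^nB^n requires matching counts across the whole string. A short sketch (hypothetical function names, not from the study):

```python
import re

def is_fsg(seq, n=None):
    """Finite-state grammar (AB)^n: alternating A-B pairs, e.g. ABAB.
    Decidable by a purely local, pairwise check."""
    if not seq or len(seq) % 2:
        return False
    pairs = [seq[i:i + 2] for i in range(0, len(seq), 2)]
    return all(p == "AB" for p in pairs) and (n is None or len(pairs) == n)

def is_psg(seq, n=None):
    """Phrase structure grammar A^n B^n: center-embedded, e.g. AABB = [A[AB]B].
    Requires counting: the number of As must equal the number of Bs."""
    m = re.fullmatch(r"(A+)(B+)", seq)
    return bool(m) and len(m.group(1)) == len(m.group(2)) \
        and (n is None or len(m.group(1)) == n)
```

Strings like ABAB satisfy the FSG but not the PSG, and AABB the reverse — exactly the contrast the violation manipulations exploit.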
Brain and Language, 2003
The current research addresses the hypothesis that certain aspects of sequential cognition have made substantial contributions to the human language processing capability, from a functional neurophysiology perspective. We first describe a cognitive sequence processing model that was developed based on the functional neuroanatomy of primate cortex and basal ganglia. We demonstrate how this model is capable of simulating the behavior of human infants in extracting serial, temporal and abstract structure from language-like sound sequences as revealed in recent psycholinguistic experiments. We then demonstrate how, through training, this model can perform adult level syntactic comprehension, based on dissociated processing streams for open vs. closed class words. The model subsequently predicts: (1) that impaired syntactic processing (as in agrammatic aphasia) will be associated with impairments in corresponding non-linguistic cognitive sequencing tasks, and (2) that neurophysiological processes (as revealed by ERPs) involved in syntactic processing should also be involved in the corresponding non-linguistic cognitive sequencing tasks. Data confirming these predictions are reviewed. We conclude that the study of sequential cognition will provide a new paradigm for the investigation of the neurophysiological bases of language.
Abstract linguistic structure correlates with temporal activity during naturalistic comprehension
Brain and Language, 2016
Neurolinguistic accounts of sentence comprehension identify a network of relevant brain regions, but do not detail the information flowing through them. We investigate syntactic information. Does brain activity implicate a computation over hierarchical grammars or does it simply reflect linear order, as in a Markov chain? To address this question, we quantify the cognitive states implied by alternative parsing models. We compare processing-complexity predictions from these states against fMRI timecourses from regions that have been implicated in sentence comprehension. We find that hierarchical grammars independently predict timecourses from left anterior and posterior temporal lobe. Markov models are predictive in these regions and across a broader network that includes the inferior frontal gyrus. These results suggest that while linear effects are widespread across the language network, certain areas in the left temporal lobe deal with abstract, hierarchical syntactic representations.
2016
Structured sequence processing (SSP) refers to the neurocognitive mechanisms used to learn sequential patterns in the environment. SSP ability seems to be important for language (Conway, Bauernschmidt, Huang, & Pisoni, 2010); however, there are few neural studies showing an empirical connection between SSP and language. The purpose of this study was to investigate the association between SSP and language processing by comparing the underlying neural components elicited during each type of task. Healthy adult subjects completed a visual, non-linguistic SSP task incorporating an artificial grammar and a visual morphosyntactic language task. Both tasks were designed to cause violations in expectations of items occurring in a series. Event-related potentials (ERPs) were used to examine the underlying neural mechanisms associated with these expectancy violations. The results indicated the P3a component elicited by the SSP task and the P600 component elicited by the language task shared s...