Why are computational models of text comprehension useful?
Related papers
A Computational Model for Simulating Text Comprehension
2008
This paper describes the architecture of a computer program that aims at simulating the process by which humans comprehend texts, that is, construct a coherent representation of the text's meaning by processing its sentences in turn. The program is based on psycholinguistic theories of human memory and text comprehension, namely the construction-integration model, the Latent Semantic Analysis theory of knowledge representation, and the predication algorithms. It is intended to help psycholinguists investigate the way humans comprehend texts.
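To give a feel for how such an architecture might be wired together, here is a minimal sketch, assuming NumPy, of a construction-integration cycle over LSA-style proposition vectors: propositions are linked by cosine similarity (construction), and activation is then spread over those links until the pattern settles (integration). The function names, the random stand-in vectors, and the settling rule are illustrative assumptions, not the program described in the paper.

```python
# Illustrative sketch only: links propositions by LSA-style cosine similarity,
# then spreads activation over the links until the pattern stabilizes.
import numpy as np

def cosine(u, v):
    # Cosine similarity between two semantic vectors (LSA stand-ins here).
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def construction_integration(prop_vectors, n_iter=50):
    n = len(prop_vectors)
    # Construction: connectivity matrix of pairwise semantic similarities.
    links = np.array([[cosine(u, v) for v in prop_vectors] for u in prop_vectors])
    np.fill_diagonal(links, 0.0)
    # Integration: iteratively spread activation and renormalize.
    activation = np.ones(n) / n
    for _ in range(n_iter):
        activation = np.clip(links @ activation, 0.0, None)
        peak = activation.max()
        activation = activation / peak if peak > 0 else np.ones(n) / n
    return activation  # highly activated propositions dominate the representation

# Toy usage: random vectors standing in for LSA vectors of three propositions.
rng = np.random.default_rng(0)
print(construction_integration([rng.normal(size=300) for _ in range(3)]))
```

In a fuller simulation, the vectors would come from a trained LSA semantic space and the most activated propositions would be carried into the next processing cycle.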
Understanding Reading Comprehension: Current and Future Contributions of Cognitive Science
Contemporary Educational Psychology, 1997
The ability to read and comprehend text is crucial for success in our society, and its development has been a main component of instructional practice. In the past two decades, psychologists have devoted a good deal of attention to the question of how competent adult readers comprehend text. Influenced by work in linguistics and artificial intelligence, the efforts of these cognitive scientists have dramatically increased our understanding of the psychological mechanisms underlying reading comprehension. In this article, we provide an overview of the contributions of cognitive research on text comprehension and an agenda for future research. Our specific interest is in understanding the processes by which skilled adult readers comprehend text. We first sketch the historical progression of experimental research on text processing and then present more detailed analyses of the major contributions of cognitive science. The emphasis is on the theoretical developments that have played central roles in advancing our understanding of text comprehension. Our premise is that by identifying how progress has been achieved in understanding central aspects of text comprehension, we will be in a better position to suggest how research might best proceed in domains where less progress has occurred. Finally, we identify what we consider the most pressing topics for future research and suggest the kinds of theoretical developments that appear necessary to advance research in those domains.
Text comprehension as a problem solving situation
Universitas Psychologica
Research in text comprehension has detailed how text features and cognitive processes interact to build comprehension and generate meaning. However, there is no explicit link between the cognitive processes deployed during text comprehension and their place in higher-order cognition, as in problem solving. The purpose of this paper is to propose a cognitive model that treats text comprehension as a problem solving situation and that draws on current research on well-known cognitive processes such as inference generation, memory, and simulations. The key characteristic of the model is that it explicitly includes the formulation of questions as a component that boosts representational power. Other characteristics of the model are specified, and its extensions to basic and applied research in text comprehension and higher-order cognitive processes are outlined.
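Purely as a hedged illustration of the comprehension-as-problem-solving idea sketched in this abstract, the toy Python loop below formulates a question whenever a new segment cannot be coherently integrated and answers it with a bridging inference; the threshold, scoring function, and helper names are invented for the example and are not taken from the paper.

```python
# Hypothetical sketch: when a segment does not cohere with the current
# representation, pose a question and add an inferred bridge before continuing.
def comprehend(segments, coherence, infer, threshold=0.5):
    representation = []   # evolving mental model of the text
    questions = []        # questions the reader poses while reading
    for seg in segments:
        if representation and coherence(representation, seg) < threshold:
            questions.append(f"How does '{seg}' relate to what was read so far?")
            representation.append(infer(representation, seg))  # bridging inference
        representation.append(seg)
    return representation, questions

# Toy usage with trivial stand-ins for the coherence score and the inference step.
rep, qs = comprehend(
    ["Mary heard the ice-cream truck.", "She ran inside for her money."],
    coherence=lambda rep, seg: 0.3,
    infer=lambda rep, seg: "(inferred) She wants to buy ice cream.",
)
print(qs)
```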
Psychological and Computational Models of Language Comprehension
2011
In this paper, I argue for a modified version of what Devitt (2006) calls the Representational Thesis (RT). According to RT, syntactic rules or principles are psychologically real, in the sense that they are represented in the mind/brain of every linguistically competent speaker/hearer. I present a range of behavioral and neurophysiological evidence for the claim that the human sentence processing mechanism constructs mental representations of the syntactic properties of linguistic stimuli. I then survey a range of psychologically plausible computational models of comprehension and show that they are all committed to RT. I go on to sketch a framework for thinking about the nature of the representations involved in sentence processing. My claim is that these are best characterized not as propositional attitudes but, rather, as subpersonal states. Moreover, the representational properties of these states are determined by their functional role, not solely by their causal or nomological relations to mind-independent objects and properties. Finally, I distinguish between explicit and implicit representations and argue, contra Devitt (2006), that the latter can be drawn on "as data" by the algorithms that constitute our sentence processing routines. I conclude that Devitt's skepticism concerning the psychological reality of grammars cannot be sustained.
Toward a Model of Text Comprehension and Production
The semantic structure of texts can be described both at the local microlevel and at a more global macrolevel. A model for text comprehension based on this notion accounts for the formation of a coherent semantic text base in terms of a cyclical process constrained by limitations of working memory. Furthermore, the model includes macro-operators, whose purpose is to reduce the information in a text base to its gist, that is, the theoretical macrostructure. These operations are under the control of a schema, which is a theoretical formulation of the comprehender's goals. The macroprocesses are predictable only when the control schema can be made explicit. On the production side, the model is concerned with the generation of recall and summarization protocols. This process is partly reproductive and partly constructive, involving the inverse operation of the macro-operators. The model is applied to a paragraph from a psychological research report, and methods for the empirical testing of the model are developed.

The main goal of this article is to describe the system of mental operations that underlie the processes occurring in text comprehension and in the production of recall and summarization protocols. A processing model will be outlined that specifies three sets of operations. First, the meaning elements of a text become organized into a coherent whole, a process that results in multiple processing of some elements and, hence, in differential retention. A second set of operations condenses the full meaning of the text into its gist. These processes are complemented by a third set of operations that generate new texts from the memorial consequences of the comprehension processes. These goals involve a number of more concrete objectives. We want first to be able to go through a text, sentence by sentence, specifying the processes that these sentences undergo in comprehension as well as the outputs of these processes at various stages of comprehension. Next, we propose to analyze recall protocols and summaries in the same way and to specify for each sentence the operations required to produce such a sentence.
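As a rough illustration of the cyclical, working-memory-limited processing described above, the following toy Python sketch carries only a small buffer of propositions from one cycle to the next and counts how often each proposition is processed; propositions held over more cycles are processed more often, which corresponds to the differential retention the model predicts. The recency-based buffer rule and the identifiers are simplifying assumptions, not the paper's actual selection strategy.

```python
# Toy sketch: sentence-by-sentence cycles with a small carry-over buffer.
from collections import Counter

def process_cycles(sentences, buffer_size=3):
    """sentences: list of proposition-id lists, one list per input cycle."""
    processed = Counter()   # how many cycles each proposition participated in
    buffer = []             # propositions carried over into the next cycle
    for props in sentences:
        cycle = buffer + props          # carried-over plus newly input propositions
        for p in cycle:
            processed[p] += 1           # repeated processing -> better retention
        buffer = cycle[-buffer_size:]   # keep only a few (recency heuristic)
    return processed

# Toy text base: three cycles of proposition identifiers.
print(process_cycles([["P1", "P2", "P3"], ["P4", "P2"], ["P5", "P6", "P1"]]))
```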
Models of reading comprehension and their related
This paper discusses theoretical models of reading comprehension, the pedagogical practices related to them, and the empirical evidence available in connection with each model-pedagogy pair in order to evaluate the models and practices and make informed recommendations for EFL reading teachers. According to Grabe (2009) and Han and D'Angelo (2007), current-traditional EFL reading pedagogy and teacher training are based on a top-down model of the reading process, Goodman's (1967) psycholinguistic guessing game (PGG) and its pedagogical offshoot, the Whole Language approach. The pedagogical practices of the Whole Language approach emphasize building background knowledge and vocabulary and de-emphasize deep engagement with discourse. Based on our review of the research, we argue that PGG and the Whole Language approach are not supported by empirical research. We then review the literature to a) show that empirical research supports interactive models where both top-down and bottom-up processes are involved (Kintsch, 1998; Stanovich, 2000; Bernhardt, 2011), b) show that good L2 readers derive meaning primarily from deep processing of a text's language (a bottom-up process) rather than from the application of prior knowledge or information guessing techniques (Alptekin, 2004; McNeill, 2011; Perales, 2011). We then argue that EFL pedagogy and teacher training need to turn away from the Whole Language model and be more informed by traditions emphasizing bottom-up processing of texts. We also make a theoretical proposal regarding the placement of discourse-processing strategies in current models of L2 comprehension.
A Critical Overview of Models of Reading Comprehension with a Focus on Cognitive Aspects
Iranian Journal of Research in English Language Teaching, 2014
Reading is a cognitive activity involving skills, strategies, attentional resources, knowledge resources, and their integration. The reader's role is to decode the written symbols so that information can be recovered from long-term memory to construct a plausible interpretation of the writer's message. A variety of reading models has been proposed by researchers; some focus on motivational and emotional aspects of reading, while others highlight its cognitive aspects. In this study, the models that characterize reading in terms of its cognitive aspects are reviewed, and different viewpoints on the reading process are described. This may help EFL/ESL teachers improve their understanding of the reading process and update their perspectives on teaching reading, which in turn might result in more efficient learning by not imposing overly demanding reading tasks on EFL/ESL learners.
On the theory and measurement of reading comprehension
Contemporary Educational Psychology, 1981
Whenever we have had what we thought was a good idea about reading comprehension, we've generally found that John Carroll has had it first. He may not agree with what we have said, but he has influenced our thinking enormously.
Word Knowledge in a Theory of Reading Comprehension
Scientific Studies of Reading, 2013
We reintroduce a wide-angle view of reading comprehension, the Reading Systems Framework, which places word knowledge in the center of the picture, taking into account the progress made in comprehension research and theory. Within this framework, word-to-text integration processes can serve as a model for the study of local comprehension processes, that is, those that make sense out of short stretches of text. These processes require linkage between the word identification system and the comprehension system, with the lexicon in the linking role. Studies of these processes examine the influence of one sentence on the reading of a single word in a second sentence, which enables the integration of the word meaning into the reader's mental model of the text. Skilled comprehenders, more than less skilled, show immediate use of word meanings in the integration process. Other evidence is also consistent with the assumption that word meaning processes are causal components in comprehension skill.