Grammar, Gestures, and Meaning in American Sign Language (review)
Related papers
Gesture in sign language discourse
Journal of Pragmatics, 1998
An analysis of spoken language discourse cannot be limited to merely the words used in the discourse and their grammatical arrangement. It is well known, for example, that speakers commonly gesture as they speak. These gestures often provide either independent information or important clues needed for a clear understanding of the message of the speaker. Pointing at a book when saying 'this book' identifies the book being discussed. Without the deictic gesture, the particular book being described might not be identifiable. Other gestures may provide information which is complementary to the spoken words. In either case, the addressee must combine the grammatically coded spoken information with the gestural information to produce a message which incorporates both.
The Syntax of American Sign Language: Functional Categories and Hierarchical Structure
2000
The linguistic study of sign languages attracts increasing attention from researchers in general linguistics, since analyses of the grammar, acquisition, production, and processing of sign languages provide additional evidence for modality-independent language universals. However, recent research in sign language linguistics still ranges from investigations into very basic empirical issues to highly elaborated analyses of phonological, morphological, or syntactic phenomena. This book clearly follows the second direction. It offers a detailed analysis of central aspects of the syntax of American Sign Language (ASL) within the framework of recent theories of generative grammar. The main focus of Neidle, Kegl, MacLaughlin, Bahan, and Lee (henceforth NKMBL) is on the syntactic function of nonmanual markers such as head position, facial expression, eyebrow movement, and eye gaze. The authors argue that these markers directly express abstract syntactic features located in functional heads. As a consequence, the hierarchical structure of functional projections can be derived from the distribution of nonmanual markers. The analysis proposed in this book therefore also has an important bearing on the design of modern generative theories.
One grammar or two? Sign Languages and the Nature of Human Language
Linguistic research has identified abstract properties that seem to be shared by all languages; such properties may be considered defining characteristics. In recent decades, the recognition that human language is found not only in the spoken modality but also in the form of sign languages has led to a reconsideration of some of these potential linguistic universals. In large part, the linguistic analysis of sign languages has led to the conclusion that universal characteristics of language can be stated at a level abstract enough to include languages in both spoken and signed modalities. For example, languages in both modalities display hierarchical structure at the sub-lexical and phrasal levels, as well as recursive rule application. However, this does not mean that modality-based differences between signed and spoken languages are trivial. In this article, we consider several candidate domains for modality effects, in light of the overarching question: are signed and spoken languages subject to the same abstract grammatical constraints, or is a substantially different conception of grammar needed for the sign language case? We look at differences between language types based on the use of space, iconicity, and the possibility of simultaneity in linguistic expression. The inclusion of sign languages does support some broadening of the conception of human language, in ways that apply to spoken languages as well. Still, the overall conclusion is that one grammar applies to human language, no matter the modality of expression.
Gesture, 2004
In this paper I explore the role of gesture in the development of signed languages. Using data from American Sign Language, Catalan Sign Language, French Sign Language, and Italian Sign Language, as well as historical sources describing gesture in the Mediterranean region, I demonstrate that gesture enters the linguistic system via two distinct routes. In one, gesture serves as a source of lexical and grammatical morphemes in signed languages. In the second, gestural elements become directly incorporated into signed language morphology, bypassing the lexical stage. Finally, I propose a unifying framework for understanding the gesture-language interface in signed and spoken languages.