A conceptual escape from the perils of universal grammar

2018

Noam Chomsky's notion of an innate universal grammar (see e.g. Hauser & Chomsky, 2002; Chomsky, 1976) is known to be problematic, not least for identifying recursion as the defining aspect underlying human language, given the extremely limited use of recursion in known languages and ongoing claims (see e.g. Evans & Levinson, 2009) that some languages lack recursive capacity altogether. Far more serious, perhaps, is the sense that universal grammar is delivered by a priori fiat, with empirical evidence all but completely lacking. As Evans and Levinson argue, few aspects of language are, on closer examination, universal in any meaningful sense, which might seem to count against universal biological foundations. Although a couple of decades ago Chomsky's views on language still dominated the field, recent years have seen notions such as universal grammar increasingly called into question. Whether taking the form of a universal grammar or not, innate linguistic capacity has another consequence that might seem undesirable: by suggesting that syntactically and grammatically structured language is uniquely human, it risks creating an absolute divide between human beings and other terrestrial species, a number of which, as the burgeoning field of comparative cognition has revealed over the past couple of decades, show complex cognitive abilities, including abilities such as spontaneous tool use (e.g.,

Grammar is NOT a Computer of the Human Mind/Brain

Journal of Cognition and Neuroethics, 2018

This paper will attempt to debunk the idea that human language grammar, as part of the Faculty of Language (FoL), is intrinsically a computing device. The central argument here is that grammar does not compute. One way of demonstrating this is to show that the operations of grammar in the Generative model do not have the character typical of computations. Thus, the central operation of grammar, Merge, which combines lexical items to produce larger expressions, can be defined as a recursive function, but it does not share the inductive properties of recursive functions in mathematics, the very properties by which recursive functions define computability. On the other hand, if the language faculty is a computing system, it must inherit the halting problem as well. It is easy to impose the halting problem on the selection of lexical items from the lexicon in such a manner that FoL may or may not terminate over that selection. We can say: there is no FoL way of telling whether FoL will ever terminate on x when x is a selection from the lexicon. The halting problem for FoL is disastrous for the view that grammar is a computing system of the brain/mind, since it detracts from the deterministic character of FoL. This has significant repercussions not just for grammar, which cannot be restricted to any limited view of mental computation, but also for the nature of the cognitive system as a whole, since any cognitive domain that is (supposed to be) language-like cannot be said to compute either.
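The contrast drawn in this abstract can be made concrete with a minimal sketch (an illustration only, not the paper's own formalism): Merge defined as binary set-formation trivially terminates on any finite selection of lexical items, whereas computability theory rests on partial recursive functions for which termination is undecidable in general.

```python
def merge(a, b):
    """Chomskyan Merge sketched as binary set-formation:
    combine two syntactic objects into one unordered set."""
    return frozenset([a, b])

# A toy derivation: Merge applies bottom-up to a finite selection
# of lexical items, so every derivation here halts.
vp = merge("eat", "apples")   # {eat, apples}
tp = merge("will", vp)        # {will, {eat, apples}}

assert vp in tp
assert merge("eat", "apples") == merge("apples", "eat")  # unordered
```

Because each application consumes material drawn from a finite numeration, this toy Merge cannot fail to halt; the paper's point is precisely that general recursive functions, which define computability, carry no such guarantee.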

Cognition, universal grammar, and typological generalizations

Lingua, 2013

We consider here two potential arguments for Universal Grammar other than that based on poverty of the stimulus. One stems from the limited number of notions that are grammatically encoded in the languages of the world. The other rests on the fact that, of all mathematically possible orders of constituents, only a subset is actually attested. Neither limitation appears to follow naturally from cognitive, historical, cultural, processing, or other factors, which makes it plausible to think of them as forced upon us by Universal Grammar, perhaps as a consequence of how it crystallized at some distant point of the evolution of our species.
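The second argument can be illustrated with a quick count. The classic case is Greenberg's Universal 20 (the figure for attested orders follows typological surveys such as Cinque 2005, not this abstract): of the 24 mathematically possible linear orders of demonstrative, numeral, adjective, and noun in the nominal phrase, only around 14 are reported across the world's languages.

```python
from itertools import permutations

# Enumerate every mathematically possible linear order of the four
# nominal-phrase elements discussed under Greenberg's Universal 20.
elements = ("Dem", "Num", "Adj", "N")
possible_orders = list(permutations(elements))

print(len(possible_orders))  # 24 logically possible orders (4!)

# Typological surveys (e.g. Cinque 2005) report that only a subset,
# roughly 14 of the 24, is actually attested in natural languages.
```

The gap between the 24 logically available orders and the attested subset is exactly the kind of unexplained restriction the abstract offers as evidence for Universal Grammar.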

A Solution to the Logical Problem of Language Evolution: Language as an Adaptation to the Human Brain

2012

Chomsky's (e.g. 1965, 1980) proposal that human language is underpinned by a genetically specified universal grammar (UG) is astonishingly bold. Generative grammar is seen not primarily as concerned with finding the most elegant account of the linguistic patterns observed in the world's languages, but rather as a part of biology: it is viewed as specifying the structure of a "language organ" whose development unfolds under genetic control.

The Nature of Grammar, its Role in Language and its Evolutionary Origins (dissertation)

PhD Dissertation, University of East London, 2007

Grammar is more than just order and hierarchy; it is a way of expressing complex multidimensional schemas in one dimension. The need to communicate these schemas is the concern of language, but how they are communicated is the concern of grammar. Because grammar does not necessarily rely on the preexistence of language, it is possible for the elements of grammar to be prototyped as features of other mental systems before language appears. These elements can then be exapted as needed for language. So the genesis of language and the genesis of grammar do not necessarily need to be considered as a single process.

On the Nature of Syntax

2008

There is a tendency in science to proceed from descriptive methods towards an adequate explanatory theory and then to move beyond its conclusions. Our purpose is to discover the concepts of computational efficiency in natural language that exclude redundancy, and to investigate how these relate to more general principles. By developing the idea that linguistic structures possess the features of other biological systems, this article focuses on the third factor that enters into the growth of language in the individual. It is suggested that the core principles of grammar can be observed in nature itself. The Faculty of Language is an efficient mechanism designed for the continuation of movement in compliance with optimization requirements. To illustrate this, a functional explanation of syntactic Merge is offered in this work, and an attempt is made to identify some criteria that single out this particular computational system as species-specific.

Universal grammar and semiotic constraints

2003

It has become an unquestioned dictum in modern linguistics that all human languages share a core set of common grammatical principles: a Universal Grammar (UG). What is to be included among these universals is not universally agreed upon, nor are the elements all of the same type (e.g. some candidate universals are rule-like, some are constraint-like, and some are structural), nor is there agreement on the source of these universals (e.g. nature/nurture). Over time, there has also been some erosion of the features once considered categorically universal and an expansion of the features considered the variable expression of universal biases and constraints. For the most part, theories of language structure, language processing, and language origins all take many of the most common regularities of language as givens. But the universality of words, the major constituent class distinctions, the complementarity of noun-like and predicate-like constituents in the formation of grammatical sentences, and the ubiquity of recursive relationships, among many other basic universals, cannot be taken as self-evident axioms. The existence of even vaguely analogous counterparts in non-human communication and cognition is questionable, and even theories that assume them to be innately pre-specified must ultimately face the question of why such a system evolved this way and not in some other way. Finding the proper joints at which to cut language into its universal and variable components, and understanding why there are language universals in the first place, are key mysteries of human cognition. But in one sense they are not linguistic questions. To probe them we must ultimately consider infra-linguistic factors: the semiotic, functional, neurological, and evolutionary constraints and biases that may have played a role. By far the most influential proposal concerning the origins of language universals derives from arguments originally put forth by Noam Chomsky.
In a series of progressively refined theoretical positions, Chomsky and his followers have argued that the core language universals derive from an innate language-specific 'mental faculty' (e.g. Chomsky 1968; 1980; 1994; Pinker 1994). Human language competence is, in this view, a set of biologically inherited language principles that specify possible grammars and their variable components. This complex claim has been the focus of one of the most heated debates in the history of the study of language. Although most linguists agree that there are universals, there is considerable disagreement concerning their origin: whether they are innate and biologically evolved, or else culturally constructed conventions that must be learned. These two sources have been treated as though they are exhaustive alternatives. Whatever the ultimate answer turns out to be, it is assumed by proponents on both sides of the debate that it will need to be stated in terms of these two options, or some combination of them.

Universal Grammar: Arguments for its Existence

Advances in Language and Literary Studies, 2021

The first part of this paper discusses the rationale for universal grammar (UG) theory as an explanation of first language acquisition. It also illustrates the issues of language acquisition that Chomsky argued could not be accounted for by behaviourist theories, and shows how Chomsky proposed a solution to this problem through his theoretical model of universal grammar. The next part outlines the theory's key tenets, arguing that these principles must be an innate endowment of the human mind. Moreover, the study illustrates specific examples of grammatical phenomena that universal grammar seeks to explain. Lastly, it shows that certain distinct grammatical features are linked and that these connections can be explained within the Universal Grammar theoretical framework. The only reasonable explanation, it argues, is that first language learning needs only limited linguistic exposure to activate these principles and set criteria for the language being learned, for children whose minds have already been wired with essent...