Type Logical Grammar: Categorial Logic of Signs

Type Logical Grammar

1994

Look under the hood of most theories of grammar or computational linguistic formalisms and you will find a "machine," often fueled by "rules," that grinds together (descriptions of) linguistic objects to produce other (descriptions of) linguistic objects. Such machines are justified by their descriptive success, or by claims that they explain (aspects of) linguistic ability. However, the puzzle of the origins of the machine remains. In our present state of knowledge about language, proposed machines can be only indirectly justified. And even if our knowledge were sufficient for more direct justifications, the broader question still remains of what the necessary properties are of any sign system that associates an open-ended set of meanings with elements of a ...

Grammar and logical types

In 7th Amsterdam Colloquium

This paper represents categorial grammar as an implicational type theory in the spirit of Girard's linear logic, and illustrates linguistic applications of a range of type-constructors over and above implication. ...
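
As a point of orientation (an illustration in standard Lambek notation, not an excerpt from the paper): the directional slashes are the implications in question, and derivations are driven by lexical type assignments. A minimal sketch, assuming an invented toy lexicon John : np, walks : np\s, loves : (np\s)/np:

  $\mathrm{np},\ \mathrm{np}\backslash\mathrm{s} \ \vdash\ \mathrm{s}$   ("John walks")
  $\mathrm{np},\ (\mathrm{np}\backslash\mathrm{s})/\mathrm{np},\ \mathrm{np} \ \vdash\ \mathrm{s}$   ("John loves Mary")

Under the Curry-Howard view both sequents are provable by directional implication elimination in a resource-sensitive logic, which is what makes the comparison with Girard's linear logic natural; the further type-constructors mentioned in the abstract extend this implicational core.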

Logical types and linguistic types

Tertium Non Datur, 1986

One of the primary aims of linguistic semantics is to translate the expressions of natural language into formulas of some logical calculus. These formulas, in turn, can be interpreted in the appropriate models, and semantic notions like truth, entailment, etc. can be formally defined in the usual manner. In addition to the well-known theoretical advantages of such an intermediate logical form, there is a practical advantage as well: given a system of rules for translation from formulas to natural language, it will be possible to translate from one natural language to another (via the interlingua) without actually evaluating the expressions of the source language. Although the calculi used as intermediate language in machine translation range from first-order predicate calculus (e.g. Schubert-Pelletier 1982) to the higher order intensional calculus of Montague Grammar (e.g. Landsbergen 1977), so far no type-free calculus has been employed for this purpose.
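
To make the interlingua idea concrete, here is a textbook-style sketch (the particular formulas are illustrative and not drawn from the works cited above):

  "Every man walks"  $\mapsto\ \forall x\,(\mathrm{man}(x) \rightarrow \mathrm{walk}(x))$
  "A man walks"      $\mapsto\ \exists x\,(\mathrm{man}(x) \wedge \mathrm{walk}(x))$

Given a generation component that maps such formulas back into sentences of a target language, translation can proceed source sentence → formula → target sentence without evaluating the source expression in a model.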

Towards a More Lexical and Functional Type-Logical Theory of Grammar

Type-Logical Lexical Functional Grammar is a new, radically lexicalist, and formally parsimonious theory, in essence a reincarnation of Lexical Functional Grammar (Kaplan and Bresnan, 1982) in a type-logical formal framework very similar in formal nature to that of Type-Logical Categorial Grammar (Morrill, 1994; Moortgat, 1997). It puts emphasis on having a simple logical foundation as its formal basis and no empirically unmotivated primitives, representations, and mappings between them. It differs from TLCG in basing syntactic analyses on functional rather than constituent structure, from both LFG and TLCG in that it rejects syntactic categories as primitives, and from LFG in that it rejects c-structure as a linguistically significant representation and in being radically lexicalist. The present paper presents TL-LFG, the sequence of developments that led to it, and its key differences from LFG.
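
For orientation, the functional structure that this approach takes as basic is the attribute-value representation familiar from standard LFG; a simplified textbook-style example for "John walks" (not taken from the paper) is:

  [ PRED  'walk<SUBJ>'
    TENSE present
    SUBJ  [ PRED 'John' ] ]

TL-LFG, as described above, keeps this level while dispensing with c-structure and with syntactic categories as primitives.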

The syntax-semantics interface of ‘respective’ predication: a unified analysis in Hybrid Type-Logical Categorial Grammar

Natural Language & Linguistic Theory, 2015

This paper proposes a unified analysis of the 'respective' readings of plural and conjoined expressions, the internal readings of symmetrical predicates such as same and different, and the summative readings of expressions such as a total of $10,000. These expressions pose significant challenges to compositional semantics, and have been studied extensively in the literature. However, almost all previous studies focus exclusively on one of these phenomena, and the close parallels and interactions that they exhibit have been mostly overlooked to date. We point out two key properties common to these phenomena: (i) they target all types of coordination, including nonconstituent coordination such as Right-Node Raising and Dependent Cluster Coordination; (ii) the three phenomena all exhibit multiple dependency, both by themselves and with respect to each other. These two parallels suggest that one and the same mechanism is at the core of their semantics. Building on this intuition, we propose a unified analysis of these phenomena, in which the meanings of expressions involving coordination are formally modelled as multisets, that is, sets that allow for duplicate occurrences of identical elements. The analysis is couched in Hybrid Type-Logical Categorial Grammar. The flexible syntax-semantics interface of this framework enables an analysis of 'respective' readings and related phenomena which, for the first time in the literature, yields a simple and principled solution for both the interactions with nonconstituent coordination and the multiple dependency noted above.
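
As a rough illustration of the multiset idea (a toy Haskell sketch with invented names, not the Hybrid Type-Logical Categorial Grammar analysis itself; ordinary lists stand in for the paper's multisets, with order used only to make the pointwise pairing concrete):

  -- Conjoined meanings are represented as lists standing in for ordered
  -- multisets; a 'respective' operator pairs two such collections pointwise.
  import Data.List (intercalate)

  data Entity = John | Bill | Sue | Mary deriving (Show, Eq)

  type Pred2 = Entity -> Entity -> String

  marry :: Pred2
  marry x y = "marry(" ++ show x ++ ", " ++ show y ++ ")"

  -- Apply a binary relation to two collections elementwise, yielding a
  -- collection of propositions (rendered here simply as strings).
  respective :: Pred2 -> [Entity] -> [Entity] -> [String]
  respective p xs ys = zipWith p xs ys

  -- "John and Bill married Sue and Mary, respectively"
  main :: IO ()
  main = putStrLn (intercalate " & " (respective marry [John, Bill] [Sue, Mary]))
  -- prints: marry(John, Sue) & marry(Bill, Mary)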

Type Theory and Natural Language: Do We Need Two Basic Types?

2012

0. A universal, or almost universal distinction, in syntax: Sentence and NP
1. A possibly universal foundation for natural language semantics: types e and t
2. Thought experiments: “Monocategoric”? and just one basic semantic type?
3. Ingredients for a possible one-basic-type semantics
3.1. Neo-Davidsonian semantics of event sentences
3.2. Kamp-Heim semantics for indefinite NPs
3.3. Open formulas are “almost” type-neutral
3.4. Exploit the similarit...
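
For concreteness, the Neo-Davidsonian ingredient in item 3.1 treats an event sentence roughly as follows (a standard illustration, not quoted from the paper):

  "John walks"  $\mapsto\ \exists e\,(\mathrm{walk}(e) \wedge \mathrm{Agent}(e, \mathrm{john}))$

Since the sentence meaning is then built from predications over an event variable, this is one of the ingredients listed above that make a one-basic-type semantics worth exploring.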

Type Construction and the Logic of Concepts

The Language of Word Meaning, 2001

I would like to pose a set of fundamental questions regarding the constraints we can place on the structure of our concepts, particularly as revealed through language. I will outline a methodology for the construction of ontological types based on the dual concerns of capturing linguistic generalizations and satisfying metaphysical considerations. I discuss what "kinds of things" there are, as reflected in the models of semantics we adopt for our linguistic theories. I argue that the flat and relatively homogeneous typing models coming out of classic Montague Grammar are grossly inadequate to the task of modelling and describing language and its meaning. I outline aspects of a semantic theory (Generative Lexicon) employing a ranking of types. I distinguish first between natural (simple) types and functional types, and then motivate the use of complex types (dot objects) to model objects with multiple and interdependent denotations. This approach will be called the Principle of Type Ordering. I will explore what the top lattice structures are within this model, and how these constructions relate to more classic issues in syntactic mapping from meaning.
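
As a rough illustration of the dot-object idea (a toy Haskell sketch with invented type names, not Pustejovsky's own formalization): a noun like "book" denotes an object with a physical and an informational aspect, and different predicates select different aspects of one and the same individual.

  -- A complex ("dot") type modeled naively as a pair of aspects.
  data PhysObj = PhysObj { weightInGrams :: Int } deriving Show
  data InfoObj = InfoObj { title :: String } deriving Show
  data Book = Book { physAspect :: PhysObj, infoAspect :: InfoObj } deriving Show

  -- "heavy" selects the physical aspect; "interesting" the informational one.
  heavy :: Book -> Bool
  heavy b = weightInGrams (physAspect b) > 1000

  interesting :: Book -> Bool
  interesting b = title (infoAspect b) /= ""   -- placeholder criterion

  warAndPeace :: Book
  warAndPeace = Book (PhysObj 1200) (InfoObj "War and Peace")

  -- Both predicates apply felicitously to one and the same book.
  main :: IO ()
  main = print (heavy warAndPeace, interesting warAndPeace)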

Hybrid Type-Logical Grammars, First-Order Linear Logic and the Descriptive Inadequacy of Lambda Grammars

The goal of this paper is to prove that Hybrid type-logical grammars are a fragment of first-order linear logic. This embedding result has several important consequences. First, it provides a simple new proof theory for the calculus, thereby clarifying the proof-theoretic foundations of Hybrid type-logical grammars, and, since the translation is simple and direct, it also provides several new parsing strategies for Hybrid type-logical grammars. Second, NP-completeness of Hybrid type-logical grammars follows immediately. Finally, the main embedding result sheds new light on the problems with lambda grammars/abstract categorial grammars, which are a subsystem of Hybrid type-logical grammars and hence a special case of the translation into first-order linear logic.
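
The flavor of such an embedding can be conveyed by the familiar position-pair translation of Lambek-style types into first-order multiplicative linear logic (shown here only as an indicative sketch; the paper's actual translation must also cover the lambda-grammar connectives of the hybrid calculus):

  $\|p\|^{x,y} = p(x,y)$ for atomic $p$
  $\|C/B\|^{x,y} = \forall z\,(\|B\|^{y,z} \multimap \|C\|^{x,z})$
  $\|A\backslash C\|^{y,z} = \forall x\,(\|A\|^{x,y} \multimap \|C\|^{x,z})$

Here the superscript variable pairs track string positions, so provability of the translated sequent in first-order linear logic mirrors derivability, and hence grammaticality, in the source grammar.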