Application of feature propagation to dependency parsing

A transition-based parser for 2-planar dependency structures

… of the 48th Annual Meeting of the …, 2010

Finding a class of structures that is rich enough for adequate linguistic representation yet restricted enough for efficient computational processing is an important problem for dependency parsing. In this paper, we present a transition system for 2-planar dependency trees (trees that can be decomposed into at most two planar graphs) and show that it can be used to implement a classifier-based parser that runs in linear time and outperforms a state-of-the-art transition-based parser on four data sets from the CoNLL-X shared task. In addition, we present an efficient method for determining whether an arbitrary tree is 2-planar and show that 99% or more of the trees in existing treebanks are 2-planar.
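
Whether a tree is 2-planar can be read off its crossing graph: take one vertex per arc, connect two vertices whenever their arcs cross, and ask whether this graph is 2-colourable (bipartite), since a 2-colouring is exactly a split of the arcs into two non-crossing planes. The Python sketch below is a minimal, quadratic-time illustration of that definition, not the efficient procedure of the paper; the `heads` dictionary encoding of the tree is an assumption made for the example.

```python
from collections import deque

def is_2planar(heads):
    """Check 2-planarity of a dependency tree given as {dependent: head}.

    Positions are integers reflecting word order; the artificial root is
    any position that never appears as a key. The tree is 2-planar iff
    the crossing graph over its arcs is bipartite.
    """
    arcs = [tuple(sorted((d, h))) for d, h in heads.items()]

    def crosses(a, b):
        (l1, r1), (l2, r2) = a, b
        return l1 < l2 < r1 < r2 or l2 < l1 < r2 < r1

    # Crossing graph: one vertex per arc, an edge for every crossing pair.
    adj = {i: [] for i in range(len(arcs))}
    for i in range(len(arcs)):
        for j in range(i + 1, len(arcs)):
            if crosses(arcs[i], arcs[j]):
                adj[i].append(j)
                adj[j].append(i)

    # Greedy 2-colouring by BFS; a same-colour conflict means more than
    # two planes would be needed.
    colour = {}
    for start in adj:
        if start in colour:
            continue
        colour[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in colour:
                    colour[v] = 1 - colour[u]
                    queue.append(v)
                elif colour[v] == colour[u]:
                    return False
    return True
```

For instance, is_2planar({1: 3, 2: 4, 3: 0, 4: 3}) returns True: the crossing arcs can be split across the two planes, with (2, 4) placed opposite the arcs it crosses.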

Dependency Parsing Schemata and Mildly Non-Projective Dependency Parsing

Computational Linguistics, 2006

We introduce dependency parsing schemata, a formal framework based on Sikkel's parsing schemata for constituency parsers, which can be used to describe, analyze, and compare dependency parsing algorithms. We use this framework to describe several well-known projective and non-projective dependency parsers, build correctness proofs, and establish formal relationships between them. We then use the framework to define new polynomial-time parsing algorithms for various mildly non-projective dependency formalisms, including well-nested structures with their gap degree bounded by a constant k in time O(n^{5+2k}), and a new class that includes all gap degree k structures present in several natural language treebanks (which we call mildly ill-nested structures for gap degree k) in time O(n^{4+3k}). Finally, we illustrate how the parsing schema framework can be applied to Link Grammar, a dependency-related formalism.

Dependency parsing with undirected graphs

2012

We introduce a new approach to transition-based dependency parsing in which the parser does not directly construct a dependency structure, but rather an undirected graph, which is then converted into a directed dependency tree in a post-processing step. This alleviates error propagation, since undirected parsers do not need to observe the single-head constraint.
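
The post-processing step has to restore the arc directions that the undirected parser never committed to. The sketch below shows one straightforward way such a conversion could work, assuming the parser output is a connected, acyclic undirected graph containing an artificial root: orient every edge away from the root by breadth-first traversal, which re-imposes the single-head constraint in a single pass. This is an illustrative assumption, not necessarily the paper's exact reconstruction procedure.

```python
from collections import deque

def orient_from_root(n, edges, root=0):
    """Orient an undirected tree over nodes 0..n-1 into a head map.

    `edges` lists undirected (u, v) pairs forming a tree that contains
    the artificial root node `root`. Directing every edge away from the
    root gives each non-root node exactly one head, restoring the
    single-head constraint dropped during parsing.
    """
    adj = {i: [] for i in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    heads = {}
    visited = {root}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in visited:
                visited.add(v)
                heads[v] = u   # u becomes the head of v
                queue.append(v)
    return heads
```

For example, orient_from_root(4, [(0, 2), (2, 1), (2, 3)]) yields {2: 0, 1: 2, 3: 2}: node 2 heads both of its undirected neighbours once every edge is pushed away from the root.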

Treebank grammar techniques for non-projective dependency parsing

Proceedings of the 12th Conference of the …, 2009

An open problem in dependency parsing is the accurate and efficient treatment of non-projective structures. We propose to attack this problem using chart-parsing algorithms developed for mildly context-sensitive grammar formalisms. In this paper, we provide two key tools for this approach. First, we show how to reduce non-projective dependency parsing to parsing with Linear Context-Free Rewriting Systems (LCFRS), by presenting a technique for extracting LCFRS from dependency treebanks. For efficient parsing, the extracted grammars need to be transformed in order to minimize the number of nonterminal symbols per production. Our second contribution is an algorithm that computes this transformation for a large, empirically relevant class of grammars.
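
A key quantity in such a reduction is how many contiguous blocks each node's yield occupies in the sentence: that number (the node's gap degree plus one) governs the fan-out of the nonterminal covering the node and hence the parsing cost. The sketch below computes these yield blocks for a dependency tree given as a head map; it is a hedged illustration of that bookkeeping only, not the paper's grammar extraction or its production-minimizing transformation.

```python
def yield_blocks(heads, n):
    """Return, for each node 1..n, the maximal contiguous blocks of its yield.

    `heads` maps each dependent (1..n) to its head, with 0 as the
    artificial root. A node's yield is the node plus all of its
    descendants; the number of blocks minus one is its gap degree,
    which determines the fan-out of the corresponding nonterminal
    in an LCFRS-style encoding.
    """
    children = {i: [] for i in range(n + 1)}
    for d, h in heads.items():
        children[h].append(d)

    def descendants(node):
        result = {node}
        for c in children[node]:
            result |= descendants(c)
        return result

    blocks = {}
    for node in range(1, n + 1):
        positions = sorted(descendants(node))
        runs = [[positions[0], positions[0]]]
        for p in positions[1:]:
            if p == runs[-1][1] + 1:
                runs[-1][1] = p          # position extends the current block
            else:
                runs.append([p, p])      # a gap in the yield starts a new block
        blocks[node] = [tuple(r) for r in runs]
    return blocks
```

On the non-projective example heads = {1: 3, 2: 4, 3: 0, 4: 3}, node 4's yield is {2, 4}, giving the two blocks (2, 2) and (4, 4) and hence gap degree 1, while every other node has a single block.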

A Transition-Based Dependency Parser Using a Dynamic Parsing Strategy

2013

We present a novel transition-based, greedy dependency parser that implements a flexible mix of bottom-up and top-down strategies. The new strategy allows the parser to postpone difficult decisions until the relevant information becomes available. The parser achieves a 12% error reduction in unlabeled attachment score over an arc-eager parser, with a slow-down factor of 2.8.