Graph-based dependency parsers suffer from the sheer number of higher-order edges they need to (a) score and (b) consider during optimization. Here we show that when working with LP relaxations, large fractions of these edges can be pruned before they are fully scored, without any loss of optimality guarantees and, hence, accuracy. This is achieved by iteratively parsing with a subset of higher-order edges, adding higher-order edges that may improve the score of the current solution, and adding higher-order edges that are implied by the current best first-order edges. This amounts to delayed column and row generation in the LP relaxation and is guaranteed to provide the optimal LP solution. For second-order grandparent models, our method...
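As an illustration only (not the authors' implementation), the iterative procedure described in this abstract could be organized roughly as in the sketch below. The helpers solve_lp, may_improve, and implied_by are hypothetical placeholders for the LP solver over the active edge set, the column-pricing test, and the row-implication test.

```python
def parse_with_delayed_generation(sentence, first_order_edges, all_higher_order_edges,
                                  solve_lp, may_improve, implied_by):
    """Illustrative sketch of delayed column and row generation for an
    LP-relaxed higher-order dependency parser.  All helper callables and
    edge representations are hypothetical placeholders."""
    active = set()  # higher-order edges currently included in the LP
    while True:
        # Solve the relaxation over first-order edges plus the active higher-order subset.
        solution = solve_lp(sentence, first_order_edges, active)

        # Column generation: higher-order edges that could improve the current score.
        new_cols = {e for e in all_higher_order_edges - active
                    if may_improve(e, solution)}

        # Row generation: higher-order edges implied by the currently selected
        # first-order edges, whose constraints must be enforced.
        new_rows = {e for e in all_higher_order_edges - active
                    if implied_by(e, solution)}

        added = new_cols | new_rows
        if not added:  # nothing left to add: the LP optimum has been reached
            return solution
        active |= added
```

Because higher-order edges enter the problem only when they become candidates, most of them are never fully scored, which is the source of the pruning claimed in the abstract.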
Arc-eager dependency parsers process sentences in a single left-to-right pass over the input and hav...
We present experiments with a dependency parsing model defined on rich factors. Our model represents...
The aim of this thesis is to improve Natural Language Dependency Parsing. We employ a linear Shift R...
We turn the Eisner algorithm for parsing to projective dependency trees into a cubic-time algorithm ...
Most syntactic dependency parsing models may fall into one of two categories: transition- and graph-...
Parsers that parametrize over wider scopes are generally more accurate than edge-factored models. Fo...
Most existing graph-based parsing models rely on millions of hand-crafted features, which limits the...
Many NLP systems use dependency parsers as critical components. Joint learning parsers usually achi...
The high-order graph-based dependency parsing model achieves state-of-the-art accuracy by ...
Automatic syntactic analysis of natural language is one of the fundamental problems in natural langu...
Feature computation and exhaustive search have significantly restricted the speed of graph-based dep...
This paper introduces a Maximum Entropy dependency parser based on an efficient k-best Maximum Spann...
Graph parsing is known to be computationally expensive. For this reason the construction of special-...
We present a new cubic-time algorithm to calculate the optimal next step in shift-reduce dependency ...