Stanford typed dependencies are a widely desired representation of natural language sentences, but parsing is one of the major computational bottlenecks in text analysis systems. In light of the evolving definition of the Stanford dependencies and developments in statistical dependency parsing algorithms, this paper revisits the question of Cer et al. (2010): what is the tradeoff between accuracy and speed in obtaining Stanford dependencies in particular? We also explore the effects of input representations on this tradeoff: part-of-speech tags, the novel use of an alternative dependency representation as input, and distributional representations of words. We find that direct dependency parsing is a more viable solution than it was found to ...
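Direct dependency parsing, as contrasted above with the constituency-parse-then-convert pipeline, is typically implemented with a transition system such as arc-standard shift-reduce. The sketch below is a minimal illustration of that idea, not the paper's own parser: the sentence, the gold head indices, and the static-oracle replay are all illustrative assumptions.

```python
# Minimal arc-standard shift-reduce sketch (illustrative only).
# Instead of a learned classifier, a static oracle replays the
# transitions licensed by a known gold tree, to show the mechanics.

def parse(words, gold_heads):
    """Derive head assignments for `words` (index 0 is ROOT) by
    applying SHIFT / LEFT-ARC / RIGHT-ARC transitions consistent
    with `gold_heads` (gold_heads[i] = index of i's head)."""
    buffer = list(range(len(words)))   # token indices awaiting processing
    stack = []
    heads = [None] * len(words)

    def has_pending_children(i):
        # True if some dependent of i has not been attached yet;
        # RIGHT-ARC must wait until i has collected all its children.
        return any(h == i and heads[d] is None
                   for d, h in enumerate(gold_heads))

    while buffer or len(stack) > 1:
        if len(stack) < 2:
            stack.append(buffer.pop(0))           # SHIFT
            continue
        s1, s0 = stack[-2], stack[-1]
        if gold_heads[s1] == s0:
            heads[s1] = s0                        # LEFT-ARC: s0 -> s1
            stack.pop(-2)
        elif gold_heads[s0] == s1 and not has_pending_children(s0):
            heads[s0] = s1                        # RIGHT-ARC: s1 -> s0
            stack.pop()
        else:
            stack.append(buffer.pop(0))           # SHIFT
    return heads
```

For example, `parse(["ROOT", "the", "dog", "barks"], [None, 2, 3, 0])` reproduces the tree where "the" attaches to "dog", "dog" to "barks", and "barks" to ROOT. Each token is shifted once and reduced once, which is the source of the linear-time behavior that makes transition-based parsers attractive in the speed/accuracy tradeoff discussed above.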
We present a gold standard annotation of syntactic dependencies in the English Web Treebank corpus u...
This thesis focuses on the development of effective and efficient language models (LMs) for speech r...
Dependency parsing has been a prime focus of NLP research of late due to its ability to help parse l...
This paper examines the Stanford typed dependencies representation, which was designed to provide a ...
The Stanford dependency scheme aims to provide a simple and intuitive but linguistically sound way ...
After presenting a novel O(n^3) parsing algorithm for dependency grammar, we develop three contrasti...
We compare three different approaches to parsing into syntactic, bilexical dependencies for English:...
Dependency parsing has made many advancements in recent years, in particular for English. There are ...
The aim of this thesis is to improve Natural Language Dependency Parsing. We employ a linear Shift R...
Bilexical dependencies capturing asymmetrical lexical relations between heads and dependents are vie...
We present a framework for text simplification based on applying transformation rules to a typed dep...
We present a simple and effective semisupervised method for training dependency parsers. We focus on...