A new almost-parsing language model incorporating multiple knowledge sources that is based upon the concept of Constraint Dependency Grammars is presented in this paper. Lexical features and syntactic constraints are tightly integrated into a uniform linguistic structure called a SuperARV that is associated with a word in the lexicon. The SuperARV language model reduces perplexity and word error rate compared to trigram, part-of-speech-based, and parser-based language models. The relative contributions of the various knowledge sources to the strength of our model are also investigated by using constraint relaxation at the level of the knowledge sources. We have found that although each knowledge source contributes to language model quality,...
This technical report concerns the development of a probabilistic Constraint Dependency Grammar (CDG...
This study shows that using computational linguistic models is beneficial for descriptive linguistic...
In this paper we propose a new approach to language modeling based on dynamic Bayesian networks. The...
This thesis focuses on the development of effective and efficient language models (LMs) for speech r...
A new language model is presented which incorporates local N-gram dependencies with two important so...
We present a novel, structured language model, the Supertagged Dependency Language Model, to model the sy...
Statistical approaches to language learning typically focus on either short-range syntactic dependen...
In this dissertation, we have proposed novel methods for robust parsing that integrate the flexibili...
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...
A grammar model for concurrent, object-oriented natural language parsing is introduced. Complete l...
After presenting a novel O(n^3) parsing algorithm for dependency grammar, we develop three contrasti...
Language models are an important component of speech recognition. They aim to predict the probabilit...
Today, the top performing parsing algorithms rely on the availability of annotated data for learning...
Language models (LMs) are essential components of many applications such as speech recognition or ma...
Natural language processing requires flexible control of computation on various sorts of constraints...