Language models are an important component of speech recognition: they aim to predict the probability of any word sequence. Traditional n-gram language models consider only adjacent lexical dependencies and therefore ignore longer-spanning dependencies. One way to capture these long-spanning dependencies is to augment traditional models with head-dependency information. A head word is the critical element in a phrase; its dependents modify it. Prior research has shown that head-dependency relationships acquired from an annotated corpus can improve n-gram models. We propose a probabilistic parser and language model that incorporate head-dependency relationships obtained from the unsupervised algorithm in [10]. Preliminary information-theoretic me...
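For intuition only, the sketch below shows one way such a combination could look in code: a bigram model interpolated with a head-conditioned dependent model. The class name, the add-one smoothing, the fixed interpolation weight, and the toy head annotations are illustrative assumptions and do not reproduce the parser or language model proposed above.

```python
# Minimal illustrative sketch (not the proposed model): a bigram estimate
# interpolated with a head-dependency estimate. All names and the weight
# `lam` are hypothetical placeholders.
from collections import Counter, defaultdict

class HeadDependencyLM:
    def __init__(self, lam=0.7):
        self.lam = lam                      # interpolation weight (assumed)
        self.bigram = defaultdict(Counter)  # counts of (w_{i-1}, w_i)
        self.dep = defaultdict(Counter)     # counts of (head word, dependent)
        self.unigram = Counter()

    def train(self, sentences, heads):
        """sentences: list of token lists; heads: parallel lists giving the
        index of each token's head word (None for the root), as some
        dependency parser or head-finding algorithm would supply."""
        for toks, hd in zip(sentences, heads):
            for i, w in enumerate(toks):
                self.unigram[w] += 1
                if i > 0:
                    self.bigram[toks[i - 1]][w] += 1
                if hd[i] is not None:
                    self.dep[toks[hd[i]]][w] += 1

    def prob(self, w, prev, head):
        """P(w | prev, head) = lam * P_bigram(w | prev)
                               + (1 - lam) * P_dep(w | head),
        with add-one smoothing on both components."""
        vocab = max(len(self.unigram), 1)
        p_big = (self.bigram[prev][w] + 1) / (sum(self.bigram[prev].values()) + vocab)
        p_dep = (self.dep[head][w] + 1) / (sum(self.dep[head].values()) + vocab)
        return self.lam * p_big + (1 - self.lam) * p_dep


# Toy usage: "the old dog barked", where "dog" heads "the" and "old",
# and "barked" heads "dog".
lm = HeadDependencyLM()
lm.train([["the", "old", "dog", "barked"]], [[2, 2, 3, None]])
print(lm.prob("dog", prev="old", head="barked"))
```

The point of the interpolation is that the head-conditioned term can reward a word whose governing head lies outside the n-gram window, which is exactly the long-spanning dependency an adjacent-word model misses.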
It seems obvious that a successful model of natural language would incorporate a great deal of both ...
We design a language model based on a generative dependency structure for sentences. The parameter ...
In this paper, an extension of n-grams is proposed. In this extension, the memory of the model (n) i...
We present the first application of the head-driven statistical parsing model of Collins (1999) as a...
This paper presents a novel approach that improves the robustness of prosody dependent language mode...
Grammar-based natural language processing has reached a level where it can 'understand' language to ...
A new language model is presented which incorporates local N-gram dependencies with two important so...
After presenting a novel O(n^3) parsing algorithm for dependency grammar, we develop three contrasti...
Statistical models for parsing natural language have recently shown considerable success in broad-co...
Building models of language is a central task in natural language processing. Traditionally, languag...
The paper presents a language mode...
This thesis focuses on the development of effective and efficient language models (LMs) for speech r...
This thesis contributes to the research domain of statistical language modeling. In this domain, the...
This paper describes an extension of the n-gram language model: the similar n-...