Language models (LMs) are essential components of many applications such as speech recognition or machine translation. LMs factorize the probability of a string of words into a product of P(w_i|h_i), where h_i is the context (history) of word w_i. Most LMs use the previous words as the context. The paper presents two alternative approaches: post-ngram LMs (which use the following words as context) and dependency LMs (which exploit the dependency structure of a sentence and can use e.g. the governing word as context). Dependency LMs could be useful whenever the topology of a dependency tree is available but its lexical labels are unknown, e.g. in tree-to-tree machine translation. In comparison with a baseline interpolated trigram LM, both of the approaches ...
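As a minimal sketch of the factorization described above (the notation $h_i$ for the context of word $w_i$ follows the abstract; the specific context choices shown are illustrative, not a definitive account of the models):

  P(w_1, \dots, w_N) = \prod_{i=1}^{N} P(w_i \mid h_i),

where, for example, $h_i = (w_{i-2}, w_{i-1})$ in a standard trigram LM, $h_i = (w_{i+1}, w_{i+2})$ in a post-ngram (here post-trigram) LM, and $h_i$ is the governing (head) word of $w_i$ in a dependency LM.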
It seems obvious that a successful model of natural language would incorporate a great deal of both ...
Dependency structure provides grammatical relations between words, which have been shown to be effective...
This paper describes an extension of the n-gram language model: the similar n-...
A new language model is presented which incorporates local N-gram dependencies with two important so...
In language modeling, n-gram models are probabilistic models of text that use some limited amount of...
This thesis focuses on the development of effective and efficient language models (LMs) for speech r...
Language models are an important component of speech recognition. They aim to predict the probabilit...
Recent advances in Information Retrieval are based on using Statistical Language Model...
This paper presents two techniques for language model (LM) adaptation. The first aims to build a mor...
Standard phrase-based translation models do not explicitly model context dependence between transla...
Sentence completion is a challenging semantic modeling task in which models must choose the most ap...
We present a novel structured language model, the Supertagged Dependency Language Model, to model the sy...
In statistical language modelling, the classic model used is the $n$-gram. This mod...
Language modeling associates a sequence of words with an a priori probability, which is a key part...
Conventional n-gram language models are well-established as powerful yet simple mechanisms for chara...
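The interpolated trigram baseline mentioned in the first abstract can be sketched in a few lines of Python; the function names, the fixed interpolation weights, and the plain maximum-likelihood estimates below are illustrative assumptions rather than the setup of any of the cited papers, which would normally tune the weights on held-out data and apply smoothing:

from collections import Counter

def train(corpus):
    # Collect unigram, bigram, and trigram counts plus context counts
    # from a list of tokenized sentences (lists of word strings).
    uni, bi, tri = Counter(), Counter(), Counter()
    bi_ctx, tri_ctx = Counter(), Counter()
    n_tokens = 0
    for sent in corpus:
        toks = ["<s>", "<s>"] + sent + ["</s>"]
        for i in range(2, len(toks)):
            w, h1, h2 = toks[i], toks[i - 1], toks[i - 2]
            uni[w] += 1
            bi[(h1, w)] += 1
            tri[(h2, h1, w)] += 1
            bi_ctx[h1] += 1
            tri_ctx[(h2, h1)] += 1
            n_tokens += 1
    return uni, bi, tri, bi_ctx, tri_ctx, n_tokens

def prob(w, h2, h1, model, lambdas=(0.5, 0.3, 0.2)):
    # Interpolated P(w | h2, h1): weighted mix of trigram, bigram,
    # and unigram maximum-likelihood estimates.
    uni, bi, tri, bi_ctx, tri_ctx, n_tokens = model
    l3, l2, l1 = lambdas
    p1 = uni[w] / n_tokens if n_tokens else 0.0
    p2 = bi[(h1, w)] / bi_ctx[h1] if bi_ctx[h1] else 0.0
    p3 = tri[(h2, h1, w)] / tri_ctx[(h2, h1)] if tri_ctx[(h2, h1)] else 0.0
    return l3 * p3 + l2 * p2 + l1 * p1

# Example: model = train([["the", "cat", "sat"], ["the", "dog", "sat"]])
#          prob("sat", "the", "cat", model)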