In this paper, we describe a unified probabilistic framework for statistical language modeling, the latent maximum entropy principle, which can effectively incorporate various aspects of natural language, such as local word interaction, syntactic structure and semantic document information. Unlike previous work on maximum entropy methods for language modeling, which only allow explicit features to be modeled, our framework also allows relationships over hidden features to be captured, resulting in a more expressive language model. We describe efficient algorithms for marginalization, inference and normalization in our extended models. We then present experimental results for our approach on the Wall Street Journal corpus.
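To make the contrast concrete, the "explicit features" baseline the abstract refers to is the standard log-linear maximum entropy model: a distribution p(w) ∝ exp(Σ_i λ_i f_i(w)) whose weights are fit so that model feature expectations match empirical ones. The sketch below is a minimal, hypothetical illustration of that baseline (toy vocabulary, made-up binary features, plain gradient ascent), not the paper's latent extension, where some features would additionally depend on hidden variables.

```python
import math

# Toy vocabulary; a real language model would define features over
# (history, word) pairs rather than single words.
VOCAB = ["the", "cat", "sat"]

# Explicit binary feature functions f_i(w) -- purely illustrative.
FEATURES = [
    lambda w: 1.0 if w == "the" else 0.0,       # f_0: word is "the"
    lambda w: 1.0 if w.endswith("t") else 0.0,  # f_1: word ends in "t"
]

def me_distribution(weights):
    """Log-linear (maximum entropy) model: p(w) = exp(sum_i l_i f_i(w)) / Z."""
    scores = {w: math.exp(sum(l * f(w) for l, f in zip(weights, FEATURES)))
              for w in VOCAB}
    z = sum(scores.values())  # partition function (normalization)
    return {w: s / z for w, s in scores.items()}

def expected_features(p):
    """Model expectations E_p[f_i], computed by summing over the vocabulary."""
    return [sum(p[w] * f(w) for w in VOCAB) for f in FEATURES]

def fit(empirical, steps=500, lr=0.5):
    """Gradient ascent on log-likelihood: each weight moves by
    lr * (empirical expectation - model expectation)."""
    weights = [0.0] * len(FEATURES)
    for _ in range(steps):
        model = expected_features(me_distribution(weights))
        weights = [l + lr * (e - m)
                   for l, e, m in zip(weights, empirical, model)]
    return weights

# Made-up empirical feature expectations: "the" occurs 60% of the time,
# words ending in "t" occur 40% of the time.
weights = fit([0.6, 0.4])
p = me_distribution(weights)
```

At convergence the fitted distribution satisfies the moment constraints (p("the") ≈ 0.6, and "cat"/"sat" split the remaining mass symmetrically). The latent extension described in the abstract would replace the empirical expectations with expectations over hidden features, which is what makes the explicit-feature machinery above insufficient on its own.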