This study examines how to take advantage of distant information in statistical language models. We show that it is possible to use n-gram models whose histories differ from those used during training; we call these crossing context models. Our study deals with classical and distant n-gram models. A mixture of four models is proposed and evaluated. A bigram linear mixture achieves an improvement of 14% in terms of perplexity, and the trigram mixture outperforms the standard trigram by 5.6%. These improvements have been obtained without increasing the complexity of standard n-gram models. The resulting mixture language model has been integrated into a speech recognition system. Its evaluation achiev...
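The linear mixture described in this abstract can be sketched as an interpolation of a standard bigram (history = previous word) with a distant bigram (history = the word two positions back). The function names, toy corpus, smoothing scheme (additive), and interpolation weights below are all illustrative assumptions; the paper's actual smoothing and weight-estimation methods are not given in this snippet.

```python
# Minimal sketch, assuming additive smoothing and fixed interpolation
# weights (in practice the weights would be tuned, e.g. by EM).
from collections import Counter

def train_counts(corpus, gap):
    """Count (history, word) pairs where the history is `gap` positions back."""
    pairs, hist = Counter(), Counter()
    for sent in corpus:
        for i in range(gap, len(sent)):
            h, w = sent[i - gap], sent[i]
            pairs[(h, w)] += 1
            hist[h] += 1
    return pairs, hist

def prob(pairs, hist, h, w, vocab_size, alpha=1.0):
    # Laplace-smoothed conditional probability P(w | h).
    return (pairs[(h, w)] + alpha) / (hist[h] + alpha * vocab_size)

def mixture_prob(models, weights, context, w, vocab_size):
    # Linear interpolation: P(w | context) = sum_k lambda_k * P_k(w | h_k),
    # where each component model reads a different history from the context.
    total = 0.0
    for (pairs, hist, gap), lam in zip(models, weights):
        h = context[-gap]
        total += lam * prob(pairs, hist, h, w, vocab_size)
    return total

corpus = [["the", "cat", "sat", "on", "the", "mat"],
          ["the", "dog", "sat", "on", "the", "rug"]]
vocab = {w for s in corpus for w in s}
classical = train_counts(corpus, gap=1)  # standard bigram: h = w_{i-1}
distant = train_counts(corpus, gap=2)    # distant bigram:  h = w_{i-2}
models = [classical + (1,), distant + (2,)]
weights = [0.7, 0.3]  # hypothetical interpolation weights, summing to 1
p = mixture_prob(models, weights, ["on", "the"], "mat", len(vocab))
```

Because each smoothed component distribution sums to 1 over the vocabulary, the mixture remains a proper distribution whenever the weights sum to 1.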
International conference with peer-reviewed proceedings. Statistical language ...
Since the advent of deep learning, automatic speech recognition (ASR), like many other fields, has a...
Grammar-based natural language processing has reached a level where it can `understand' language to ...
International conference with peer-reviewed proceedings. Classical statistical...
Full text accessible only to members of the Université de Lorraine. A statistical language m...
In this paper we examine several combinations of classical N-gram language models with more advanced...
This article presents a study on how to automatically add new words into a lan...
International conference with peer-reviewed proceedings. In this paper we prop...
It seems obvious that a successful model of natural language would incorporate a great deal of both ...
In this paper, an extension of n-grams, called x-grams, is proposed. In this extension, the memory o...
In domains with insufficient matched training data, language models are often constructed by interpo...
In this paper we improve over the hierarchical Pit...
This paper describes an extension of the n-gram language model: the similar n-...
Pelemans J., ''Efficient language modeling for automatic speech recognition'', Proefschrift voorgedr...
This thesis investigates an approach to exploiting the long context based on the information about t...