This paper describes two techniques for reducing the size of statistical back-off n-gram language models in computer memory. Language model compression is achieved through a combination of quantizing language model probabilities and back-off weights and pruning parameters that are determined to be unnecessary after quantization. The recognition performance of the original and compressed language models is evaluated across three different language models and two different recognition tasks. The results show that the language models can be compressed by up to 60% of their original size with no significant loss in recognition performance. Moreover, the techniques that are described provide a principled method with which to compress langu...
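The abstract above pairs quantization of probabilities and back-off weights with pruning of parameters made redundant by that quantization. A minimal sketch of the idea, assuming uniform binning of log-probabilities into a 256-entry codebook (the function names, bin count, and uniform quantizer are illustrative choices, not the paper's actual method):

```python
import numpy as np

def quantize_logprobs(logprobs, n_bins=256):
    """Map float log-probabilities to small integer codebook indices.

    Uniform binning over the observed range: each parameter is stored
    as a one-byte index instead of a 32-bit float. A real system might
    use Lloyd-Max or k-means centroids rather than uniform bins.
    """
    lo, hi = float(logprobs.min()), float(logprobs.max())
    edges = np.linspace(lo, hi, n_bins + 1)
    idx = np.clip(np.digitize(logprobs, edges) - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])  # codebook of bin midpoints
    return idx.astype(np.uint8), centers

def prune_after_quantization(ngram_codes, backoff_codes):
    """Keep-mask for n-grams still informative after quantization.

    If an explicit n-gram probability quantizes to the same codebook
    entry its back-off path would produce anyway, storing it is redundant.
    """
    return ngram_codes != backoff_codes

# Toy usage: five trigram log-probs and hypothetical back-off codes.
logp = np.array([-1.2, -0.4, -3.7, -2.1, -0.9])
codes, codebook = quantize_logprobs(logp)
backoff = np.array([codes[0], 17, codes[2], 91, codes[4]])
keep = prune_after_quantization(codes, backoff)
print(codes, keep)  # entries equal to their back-off code can be dropped
```

The storage win comes from two directions at once: every surviving parameter shrinks from a float to a byte-sized index, and parameters whose quantized value the back-off estimate reproduces are dropped entirely.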
Despite achieving state-of-the-art performance on many NLP tasks, the high energy cost and long infe...
Natural Language Processing (NLP) has seen tremendous improvements over the last few years. Transfor...
The best general-purpose compression schemes make their gains by estimating a probability distributi...
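This line gestures at the standard link between modeling and compression: under an ideal entropy coder, a symbol with model probability p costs about -log2 p bits, so a sharper probability estimate directly shortens the code. A small illustration of that cost computation (the unigram model fit to the text itself is purely for demonstration; real compressors use far richer models plus an arithmetic coder):

```python
import math
from collections import Counter

def ideal_code_length_bits(text):
    """Total bits an ideal entropy coder would need for `text`
    under a unigram model fit to the text itself."""
    counts = Counter(text)
    total = len(text)
    return sum(-math.log2(counts[ch] / total) for ch in text)

print(ideal_code_length_bits("abracadabra"))  # ~23 bits vs 88 bits of raw bytes
```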
This paper describes a novel approach to compressing large trigram language models, which uses scala...
In this paper, a new n-gram language model compression method is proposed for applications in handhe...
N-gram language models are an essential component in statistical natural language processing systems...
ICSLP 1998: the 5th International Conference on Spoken Language Processing, November 30 - December 4...
Language modeling is an important part of both speech recognition and machine translation systems. ...
In this paper, an extension of n-grams is proposed. In this extension, the memory of the model (n) i...
Direct integration of translation model (TM) probabilities into a language mo...
Research in speech recognition and machine translation is boosting the use of large-scale n-gram lan...
Language modeling is one of the most important fields in Natural Language Processing, carried out in the...
Though statistical language modeling plays an important role in speech recognition, there are st...
We introduce factored language models (FLMs) and generalized parallel backoff (GPB). An FLM represen...
This paper reports on the benefits of large-scale statistical language modeling in machine translatio...