For resource-rich languages, recent work has shown Neural Network based Language Models (NNLMs) to be an effective modeling technique for Automatic Speech Recognition, outperforming standard n-gram language models (LMs). For low-resource languages, however, the performance of NNLMs has not been well explored. In this paper, we evaluate the effectiveness of NNLMs for low-resource languages and show that NNLMs learn better word probabilities than state-of-the-art n-gram models even when the amount of training data is severely limited. We show that interpolated NNLMs obtain a lower WER than standard n-gram models, no matter the amount of training data. Additionally, we observe that with small amounts of data (approx. 100k training tokens), fe...
© 2016 IEEE. In recent years, research on language modeling for speech recognition has increasingly ...
This paper reports on investigations using two techniques for language model t...
Recurrent neural network language models (RNNLMs) are powerful language modeling techniques. Signifi...
This paper investigates very low resource language model pretraining, when less than 100 thousand se...
Recently there is growing interest in using neural networks for language modeling. In contrast to th...
Currently, N-gram models are the most common and widely used models for statistical language modelin...
Virtually any modern speech recognition system relies on count-based language models. In this thesis...
One particular problem in large vocabulary continuous speech recognition for low-resourced languages...
This paper explores state-of-the-art techniques for creating language models in low-resource setting...
The recurrent neural network language model (RNNLM) has been demonstrated to consistently reduce per...
Language Models are an integral part of many applications like speech recognition, machine translati...
Since the advent of deep learning, automatic speech recognition (ASR), like many other fields, has a...