Recurrent neural network language models (RNNLMs) can be augmented with auxiliary features, which provide an additional modality alongside the words. RNNLMs have been found to perform best when trained on a large corpus of generic text and then fine-tuned on text from the sub-domain to which they will be applied. However, in many cases the auxiliary features are available for the sub-domain text but not for the generic text. In such cases, semi-supervised techniques can be used to infer these features for the generic text, so that the RNNLM can be trained on the generic data and then fine-tuned on the available in-domain data with its corresponding auxiliary features. In this paper, several novel approaches are investigated for dealing with ...
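As a minimal sketch of the kind of model the abstract describes, the snippet below shows an LSTM language model whose input at each time step is the word embedding concatenated with an auxiliary feature vector. The concatenation-at-input design, the class name FeatureAugmentedRNNLM, and the dimensions (emb_dim, feat_dim, hidden_dim) are illustrative assumptions, not the paper's confirmed architecture.

```python
# Sketch only: an RNNLM with auxiliary features concatenated to the word
# embedding at the input layer (an assumed, common design).
import torch
import torch.nn as nn

class FeatureAugmentedRNNLM(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, feat_dim=32, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word embedding and auxiliary feature vector are joined at the input.
        self.rnn = nn.LSTM(emb_dim + feat_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, words, aux, state=None):
        # words: (batch, time) token ids; aux: (batch, time, feat_dim)
        x = torch.cat([self.embed(words), aux], dim=-1)
        h, state = self.rnn(x, state)
        return self.out(h), state

# Usage: train on generic text with inferred (semi-supervised) features,
# then fine-tune on in-domain text where the true features are available.
model = FeatureAugmentedRNNLM(vocab_size=10000)
words = torch.randint(0, 10000, (8, 20))
aux = torch.randn(8, 20, 32)  # stand-in for inferred or true auxiliary features
logits, _ = model(words, aux)
loss = nn.functional.cross_entropy(logits[:, :-1].reshape(-1, 10000),
                                    words[:, 1:].reshape(-1))
```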