Recurrent neural network language models (RNNLM) have become an increasingly popular choice for state-of-the-art speech recognition systems due to their inherently strong generalization performance. As these models use a vector representation of complete history contexts, RNNLMs are normally used to rescore N-best lists. Motivated by their intrinsic characteristics, two novel lattice rescoring methods for RNNLMs are investigated in this paper. The first uses an n-gram style clustering of history contexts. The second approach directly exploits a distance measure between hidden history vectors. Both methods produced 1-best performance comparable with a 10k-best rescoring baseline RNNLM system on a large vocabulary conversational telephone speech recognition task.
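The two merging criteria the abstract names can be illustrated with a small sketch. The Python below is not the authors' implementation; it is a minimal toy showing (a) n-gram style clustering, where histories sharing their last n-1 words map to the same cache key, and (b) hidden-vector merging, where a hypothesis reuses a cached RNN state whose hidden history vector is sufficiently close. The names StateCache, merge_or_add, NGRAM_ORDER, and DIST_THRESHOLD are hypothetical, and cosine distance stands in for whichever hidden-vector distance the paper actually uses.

```python
import numpy as np

# Illustrative settings; in practice the order and threshold would be tuned.
NGRAM_ORDER = 4        # histories sharing their last n-1 words share a key
DIST_THRESHOLD = 0.1   # merge states whose hidden vectors are this close

def cosine_distance(u, v):
    """Cosine distance between two hidden history vectors."""
    return 1.0 - float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

class StateCache:
    """Cache of expanded lattice states during rescoring.

    A new partial hypothesis reuses a cached RNN state when it has the same
    truncated n-gram history key and its hidden vector lies within
    DIST_THRESHOLD of a cached vector; otherwise a new state is created.
    """
    def __init__(self):
        self.states = {}  # (last n-1 words) -> list of hidden vectors

    def merge_or_add(self, history, hidden):
        key = tuple(history[-(NGRAM_ORDER - 1):])  # n-gram style clustering
        for cached in self.states.setdefault(key, []):
            if cosine_distance(cached, hidden) < DIST_THRESHOLD:
                return cached, True   # merged: no new lattice node expanded
        self.states[key].append(hidden)
        return hidden, False          # kept as a distinct history state

rng = np.random.default_rng(0)
cache = StateCache()
h = rng.standard_normal(8)
_, merged_a = cache.merge_or_add(["we", "will", "call", "you"], h)
_, merged_b = cache.merge_or_add(["i", "will", "call", "you"], h + 1e-4)
print(merged_a, merged_b)  # False True: the near-identical state is shared
```

The intended effect of merging is that only one RNN state is propagated per cluster of equivalent histories, so the lattice can be rescored with far fewer forward passes than full expansion of every distinct complete history would require.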
Language modeling is a crucial component in a wide range of applications including speech recognitio...
In recent years, research on language modeling for speech recognition has increasingly ...
The goal of this thesis is to advance the use of recurrent neural network language models (RNNLMs) ...
An important part of the language modelling problem for automatic speech recognition (ASR) systems, ...
Recurrent neural network language models (RNNLMs) have recently been shown to outperform the venerable n-...
Recently, bidirectional recurrent network language models (bi-RNNLMs) have been shown to outperform ...
In recent years, recurrent neural network language models (RNNLMs) have become increasingly popular ...
Virtually any modern speech recognition system relies on count-based language models. In this thesis...
Recurrent neural network language models (RNNLMs) are powerful language modeling techniques. Signifi...
Recurrent neural networks (RNNs) are a powerful model for sequential data. End-to-end training metho...