In this paper we present a novel method for the adaptation of a multi-layer perceptron neural network (MLP ANN). Nowadays, adaptation of an ANN is usually performed as incremental retraining of either a subset or the complete set of the network's parameters. However, since the amount of adaptation data is often quite small, such an approach has a fundamental drawback: during retraining, the network parameters can easily be overfitted to the new data. There are certainly techniques that can help overcome this problem (early stopping, cross-validation); however, applying them leads to a more complex and possibly more data-hungry training procedure. The proposed method approaches the problem from a different persp...
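The incremental-retraining baseline criticized above is straightforward to sketch. The following minimal Python example (class names, layer sizes, and hyperparameters are illustrative assumptions, not taken from the paper) retrains only the output layer of a small MLP on a handful of adaptation examples and uses early stopping on a held-out split to curb the overfitting the abstract describes. Restricting the update to one layer is a common way to reduce the number of adapted parameters when data is scarce; it is shown here only as a baseline, not as the paper's proposed method.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MLP:
    # One hidden layer; the output layer (W2, b2) is the retrained subset.
    def __init__(self, d_in, d_hid, d_out):
        self.W1 = rng.normal(0.0, 0.1, (d_in, d_hid))
        self.b1 = np.zeros(d_hid)
        self.W2 = rng.normal(0.0, 0.1, (d_hid, d_out))
        self.b2 = np.zeros(d_out)

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)      # cache hidden activations
        return softmax(self.h @ self.W2 + self.b2)   # class posteriors

def adapt_output_layer(net, X, y, X_val, y_val, lr=0.05, max_epochs=200, patience=5):
    # Incremental retraining of a parameter subset, with early stopping on a
    # held-out split to limit overfitting to the small adaptation set.
    best_loss, best_W2, best_b2, bad = np.inf, net.W2.copy(), net.b2.copy(), 0
    for _ in range(max_epochs):
        p = net.forward(X)
        grad = p.copy()
        grad[np.arange(len(y)), y] -= 1.0             # per-sample gradient of cross-entropy w.r.t. logits
        net.W2 -= lr * net.h.T @ grad / len(y)
        net.b2 -= lr * grad.mean(axis=0)
        p_val = net.forward(X_val)
        val_loss = -np.log(p_val[np.arange(len(y_val)), y_val] + 1e-12).mean()
        if val_loss < best_loss:
            best_loss, best_W2, best_b2, bad = val_loss, net.W2.copy(), net.b2.copy(), 0
        else:
            bad += 1
            if bad >= patience:                       # early stopping
                break
    net.W2, net.b2 = best_W2, best_b2                 # keep the best validated weights
    return best_loss

# Toy usage: 40 adaptation frames and a 20-frame validation split.
net = MLP(d_in=20, d_hid=32, d_out=5)
X_adapt, y_adapt = rng.normal(size=(40, 20)), rng.integers(0, 5, size=40)
X_val, y_val = rng.normal(size=(20, 20)), rng.integers(0, 5, size=20)
print("best validation loss:", adapt_output_layer(net, X_adapt, y_adapt, X_val, y_val))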
This paper describes how to perform speaker adaptation for a hybrid large vocabulary speech recogni...
This work introduces a multiple connectionist architecture based on a mixture of Recurrent Neural Ne...
Rapid adaptation of deep neural networks (DNNs) with limited unsupervised data remains a significant...
In this paper we present a novel method for adaptation of a multi-layer perceptron neural network (...
In this paper we present two techniques for improving the recognition accuracy of multilayer perceptron n...
Recent progress in acoustic modeling with deep neural networks has significantly improved the perform...
It is today acknowledged that neural network language models outperform backoff language models in a...
A technique is proposed for the adaptation of automatic speech recognition systems using Hybrid mode...
We investigate a multilayer perceptron (MLP) based hierarchical approach for task adaptatio...
This paper proposes a simple yet effective model-based neural network speaker adaptation technique t...
We investigate a multilayer perceptron (MLP) based hierarchical approach for task adaptation in auto...
Speaker adaptive training (SAT) of neural network acoustic models learns models in a way that makes ...
In this paper, we propose a novel method to adapt context-dependent deep neural network hidden Marko...
We investigate the concept of speaker adaptive training (SAT) in the context of deep neural network ...
Differences between training and testing conditions may significantly degrade recognition accuracy i...