It is now widely acknowledged that neural network language models outperform backoff language models in applications like speech recognition or statistical machine translation. However, training these models on large amounts of data can take several days. We present efficient techniques to adapt a neural network language model to new data. Instead of training a completely new model or relying on mixture approaches, we propose two new methods: continued training on resampled data or insertion of adaptation layers. We present experimental results in a CAT environment where the post-edits of professional translators are used to improve an SMT system. Both methods are very fast and achieve significant improvements without overfitting the small adaptation data.
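The first of the two methods above, continued training on resampled data, relies on building an adaptation training set in which the small in-domain corpus (e.g. translator post-edits) is up-sampled relative to the large background corpus before optimization resumes from the existing model weights. A minimal sketch of such resampling is given below; the function name, the corpora, and the mixing weight are illustrative assumptions, not the authors' implementation.

```python
import random

def resample_for_adaptation(background, in_domain, n_samples,
                            in_domain_weight=0.5, seed=0):
    """Draw a training set for continued training: each sentence comes
    from the small in-domain corpus with probability `in_domain_weight`,
    up-sampling it far beyond its natural frequency in the data.

    All names here are hypothetical; the mixing weight would be tuned
    on held-out in-domain data in practice.
    """
    rng = random.Random(seed)
    sample = []
    for _ in range(n_samples):
        pool = in_domain if rng.random() < in_domain_weight else background
        sample.append(rng.choice(pool))
    return sample

# Toy corpora standing in for the background data and the post-edits.
background = [f"background sentence {i}" for i in range(10_000)]
post_edits = [f"post-edit {i}" for i in range(50)]

adapted_set = resample_for_adaptation(background, post_edits, 2000)
```

Continued training would then run a few further epochs of SGD over `adapted_set`, starting from the converged background-model weights rather than from a random initialization, which is what makes the method fast on small adaptation sets.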
Language modeling is critical and indispensable for many natural language applications such as auto...
This work is inspired by a typical machine translation industry scenario in which translators make u...
Neural network language models (NNLMs) have attracted a lot of attention recently. In this paper, we...
Neural network training has been shown to be advantageous in many natural language processing appli...
Improving machine translation (MT) by learning from human post-edits is a powerful solution that is ...
This work investigates a crucial aspect for the integration of MT technology into a CAT environment,...
The quality of translations produced by statistical machine translation (SMT) systems crucially depe...
In this paper, we propose a new domain adaptation technique for neural machine translation called co...
In this paper we present a novel method for adaptation of a multi-layer perceptron neural network (...
Data selection is an effective approach to domain adaptation in statistical machine translation. Th...
In this paper we address the issue of building language models for very small training sets by adapt...
Neural machine translation (NMT) has been shown to outperform statistical machine translation. Howe...
Virtually any modern speech recognition system relies on count-based language models. In this thesis...
Language models are a critical component of an automatic speech recognition (ASR) system. Neural net...
NLP technologies are uneven for the world's languages as the state-of-the-art models are only availa...