We introduce recurrent neural network-based Minimum Translation Unit (MTU) models which make predictions based on an unbounded history of previous bilingual contexts. Traditional back-off n-gram models suffer under the sparse nature of MTUs, which makes estimation of high-order sequence models challenging. We tackle the sparsity problem by modeling MTUs both as bags-of-words and as sequences of individual source and target words. Our best results improve the output of a phrase-based statistical machine translation system trained on WMT 2012 French-English data by up to 1.5 BLEU, and we outperform the traditional n-gram-based MTU approach by up to 0.8 BLEU.
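The bag-of-words treatment of MTUs described in this abstract could be sketched roughly as follows; the toy vocabulary, embedding dimension, and the single Elman-style recurrence step are all illustrative assumptions, not details taken from the paper:

```python
# Minimal sketch (assumed details): encode one MTU as a bag of words by
# summing word embeddings, then feed it through one Elman-style RNN step.
import numpy as np

rng = np.random.default_rng(0)
vocab = {"la": 0, "maison": 1, "the": 2, "house": 3}  # toy vocabulary
dim = 8
emb = rng.normal(size=(len(vocab), dim))  # random toy word embeddings

def encode_mtu(words):
    """Bag-of-words encoding of one MTU: sum the embeddings of its words."""
    return sum(emb[vocab[w]] for w in words)

# One MTU pairing source words with their aligned target words.
x = encode_mtu(["la", "maison", "the", "house"])

# A single recurrent step consuming the MTU encoding; stacking such steps
# over an MTU sequence gives an unbounded history of bilingual context.
W_x = rng.normal(size=(dim, dim))
W_h = rng.normal(size=(dim, dim))
h_prev = np.zeros(dim)
h = np.tanh(x @ W_x + h_prev @ W_h)
print(h.shape)
```

Because the MTU is reduced to a sum of word embeddings, rare MTUs share parameters with the common words they contain, which is one way the sparsity problem the abstract mentions can be mitigated.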
In this paper, we propose a novel recursive recurrent neural network (R2NN) to model the end-to-end...
The requirement for neural machine translation (NMT) models to use fixed-size input and output vocab...
We present a joint language and translation model based on a recurrent neural network which predic...
Recurrent neural networks (RNNs) have represented for years the state of the art in neural machine t...
The inability to model long-distance dependency has been handicapping SMT for years. Specifically, ...
The quality of translations produced by statistical machine translation (SMT) systems crucially depe...
This work presents two different translation models using recurrent neural networks. The first one...
Virtually any modern speech recognition system relies on count-based language models. In this thesis...
The quality of translations produced by statistical machine translation (SMT) systems crucially dep...
In this paper, we propose a novel neural network model called RNN Encoder–Decoder that consists of...