Previous neural machine translation models rely on heuristic search algorithms (e.g., beam search) at test time to avoid solving the intractable maximum a posteriori problem over translation sentences. In this paper, we propose \textit{Gumbel-Greedy Decoding}, which trains a generative network to predict the translation under a trained model. We solve this problem using the Gumbel-Softmax reparameterization, which makes our generative network differentiable and trainable through standard stochastic gradient methods. We empirically demonstrate that our proposed model is effective for generating sequences of discrete words.
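The Gumbel-Softmax reparameterization mentioned above can be illustrated with a minimal sketch: Gumbel(0, 1) noise is added to the logits and a temperature-scaled softmax replaces the hard argmax, so the "sampled" word becomes a differentiable function of the logits. The function names and the NumPy implementation below are illustrative, not taken from the paper.

```python
import numpy as np

def sample_gumbel(shape, eps=1e-20, rng=None):
    """Draw Gumbel(0, 1) noise via the inverse-CDF trick: -log(-log(U))."""
    rng = rng or np.random.default_rng(0)
    u = rng.uniform(low=eps, high=1.0, size=shape)
    return -np.log(-np.log(u))

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Continuous relaxation of a categorical sample: softmax((logits + g) / tau)."""
    g = sample_gumbel(logits.shape, rng=rng)
    y = (logits + g) / tau
    y = y - y.max()            # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

# Toy vocabulary of three words: the output is a probability vector that
# concentrates on one word as tau -> 0, yet stays differentiable in the logits.
probs = gumbel_softmax(np.array([2.0, 0.5, -1.0]), tau=0.5)
```

As the temperature `tau` approaches zero the relaxed sample approaches a one-hot vector (a hard argmax), which is why this trick lets gradients flow through the discrete word choice during training.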
Existing Sequence-to-Sequence (Seq2Seq) Neural Machine Translation (NMT) shows strong capability wit...
2018-08-01. Recurrent neural networks (RNN) have been successfully applied to various Natural Language...
Neural Machine Translation (NMT) is a new approach to machine translation that has made great progre...
We explore the application of neural language models to machine translation. We develop a new model ...
The quality of translations produced by statistical machine translation (SMT) systems crucially depe...
In this paper, we study the feasibility of using a neural network to learn a f...
Although Neural Machine Translation (NMT) models have advanced state-of-the-art performance in machi...
With economic globalization and the rapid development of the Internet, the connections between diffe...
We investigate the use of hierarchical phrase-based SMT lattices in end-to-end neural machine transl...
Neural Network Language Models (NNLMs) are a successful approach to Natural Language Processing...
Humans benefit from communication but suffer from language barriers. Machine translation (MT) aims t...
In this paper, we extend an attention-based neural machine translation (NMT) model by allowing it to...
Neural machine translation (NMT) conducts end-to-end translation with a source language encoder and ...
We present a novel scheme to combine neural machine translation (NMT) with traditional statistical m...