In this paper, we propose a novel recursive recurrent neural network (R2NN) to model the end-to-end decoding process for statistical machine translation. R2NN is a combination of a recursive neural network and a recurrent neural network, and in turn integrates their respective capabilities: (1) new information can be used to generate the next hidden state, as in recurrent neural networks, so that the language model and the translation model can be integrated naturally; (2) a tree structure can be built, as in recursive neural networks, so as to generate the translation candidates in a bottom-up manner. A semi-supervised training approach is proposed to train the parameters, and the phrase pair embedding is explored to model translation confidence d...
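The abstract above describes R2NN as combining a recursive step (building a parent node from two children, bottom-up) with a recurrent step (folding new information into the next hidden state). A minimal sketch of that composition, with hypothetical parameter names and sizes of our choosing (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8  # illustrative hidden size

# Hypothetical parameters; names and shapes are our assumptions.
W_rec = rng.standard_normal((H, 2 * H)) * 0.1  # recursive: combines two children
W_in = rng.standard_normal((H, H)) * 0.1       # recurrent: folds in new input
b = np.zeros(H)

def recursive_combine(left, right):
    """Recursive step: build a parent hidden state from two child states,
    as in a recursive neural network (bottom-up tree construction)."""
    return np.tanh(W_rec @ np.concatenate([left, right]) + b)

def recurrent_step(parent, x_new):
    """Recurrent step: fold new information (e.g. a word embedding)
    into the node's state, as in a recurrent neural network."""
    return np.tanh(parent + W_in @ x_new)

# Two leaf embeddings combined into a parent, then updated with new
# input -- one R2NN-style composition.
left, right, x = (rng.standard_normal(H) for _ in range(3))
parent = recurrent_step(recursive_combine(left, right), x)
print(parent.shape)  # (8,)
```

This is only a structural illustration of the two interleaved update rules; the paper's actual scoring, training objective, and phrase pair embeddings are not reproduced here.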
The conventional statistical machine translation (SMT) methods perform the decoding process by compo...
The goal of this thesis is to describe and build a system for neural machine translation. The system is ...
Neural network language models are often trained by optimizing likelihood, but we would prefer to op...
In this paper, we propose a novel neural network model called RNN Encoder–Decoder that consists of...
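The RNN Encoder–Decoder named above pairs an encoder RNN, which compresses a source sentence into a fixed-length vector, with a decoder RNN that generates the target sequence conditioned on that vector. A toy sketch of that two-part structure, with all sizes and parameter names chosen by us for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
V, H = 12, 6  # toy vocabulary and hidden sizes (our choice)

E = rng.standard_normal((V, H)) * 0.1    # token embeddings
W_enc = rng.standard_normal((H, H)) * 0.1  # encoder recurrence
W_dec = rng.standard_normal((H, H)) * 0.1  # decoder recurrence
W_out = rng.standard_normal((V, H)) * 0.1  # output projection

def encode(src_ids):
    """Encoder RNN: fold the source tokens into one fixed-length vector."""
    h = np.zeros(H)
    for t in src_ids:
        h = np.tanh(W_enc @ h + E[t])
    return h

def decode(context, steps):
    """Decoder RNN: greedily emit target token ids, conditioned on the
    encoder's context vector at every step."""
    h, out = context, []
    for _ in range(steps):
        h = np.tanh(W_dec @ h + context)
        out.append(int(np.argmax(W_out @ h)))
    return out

context = encode([3, 7, 1])
print(decode(context, 4))  # four greedy token ids
```

The real model uses gated hidden units and is trained jointly on sentence pairs; this untrained sketch only shows the encode-then-decode data flow.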
In this era of globalization, it is quite likely to come across people or communities who do not share...
This work presents two different translation models using recurrent neural networks. The first one...
With economic globalization and the rapid development of the Internet, the connections between diffe...
Neural machine translation is a relatively new approach to statistical machine translation based pu...
We introduce recurrent neural network-based Minimum Translation Unit (MTU) models which make predict...
The inability to model long-distance dependency has been handicapping SMT for years. Specifically, ...
Partially inspired by successful applications of variational recurrent neural networks, we propose a...