Machine translation has received significant attention in natural language processing, not only because of its inherent challenges but also because of the translation needs that arise in everyday life. In this study, we design a new machine translation model named X-Transformer, which refines the original Transformer model in three respects. First, the encoder's model parameters are compressed. Second, the encoder structure is modified by stacking two self-attention layers consecutively and reducing the point-wise feed-forward layer, helping the model capture the semantic structure of sentences more precisely. Third, we streamline the decoder, reducing its size while maintaining accuracy. Through experim...
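The encoder modification described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes single-head attention, residual connections around each sublayer, and that "reducing the feed-forward layer" means shrinking its hidden width (here to the model dimension itself, rather than the usual 4x expansion). All parameter names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention over a sequence x."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = softmax(q @ k.T / np.sqrt(d_k))
    return scores @ v

def x_transformer_encoder_block(x, params):
    """Hypothetical X-Transformer encoder block: two consecutive
    self-attention sublayers (each with a residual connection),
    followed by a reduced point-wise feed-forward sublayer."""
    for w_q, w_k, w_v in params["attn"]:  # two stacked attention sublayers
        x = x + self_attention(x, w_q, w_k, w_v)
    # reduced feed-forward: hidden width equals the model dimension
    # (assumption) instead of the standard 4x expansion
    return x + np.maximum(0.0, x @ params["w_ff1"]) @ params["w_ff2"]

# Usage with random weights, model dimension d = 8, sequence length 5
rng = np.random.default_rng(0)
d = 8
params = {
    "attn": [tuple(rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
             for _ in range(2)],
    "w_ff1": rng.normal(scale=0.1, size=(d, d)),
    "w_ff2": rng.normal(scale=0.1, size=(d, d)),
}
x = rng.normal(size=(5, d))
out = x_transformer_encoder_block(x, params)
```

The output keeps the input's shape, so blocks of this form can be stacked just like standard Transformer encoder layers.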
Though early successes of Statistical Machine Translation (SMT) systems are attributed in part to th...
In this study, a human evaluation is carried out on how hyperparameter settings impact the quality o...
Introducing factors such as linguistic features has long been proposed in machine translation to imp...
Neural machine translation has been lately established as the new state of the art in machine transl...
We explore the suitability of self-attention models for character-level neural machine translation. ...
Transformer-based models have brought a radical change to neural machine translation. A key feature ...
Humans benefit from communication but suffer from language barriers. Machine translation (MT) aims t...
The Transformer model is a very recent, fast and powerful discovery in neural machine translation. W...
Transformer is a neural machine translation model which revolutionizes machine translation. Compared...
The utility of linguistic annotation in neural machine translation seemed to have been established in...
In interactive machine translation (MT), human translators correct errors in automatic translations ...
With economic globalization and the rapid development of the Internet, the connections between diffe...
Machine Translation (MT) systems tend to underperform when faced with long, linguistically complex s...
The University of Cambridge submission to the WMT18 news translation task focuses on the combination...
The powerful modeling capabilities of all-attention-based transformer architectures often cause over...