The utility of linguistic annotation in neural machine translation seemed to have been established in past papers. However, the experiments were limited to recurrent sequence-to-sequence architectures and relatively small data settings. We focus on the state-of-the-art Transformer model and use considerably larger corpora. Specifically, we try to promote the knowledge of source-side syntax using multi-task learning, either through simple data manipulation techniques or through a dedicated model component. In particular, we train one of the Transformer attention heads to produce the source-side dependency tree. Overall, our results cast some doubt on the utility of multi-task setups with linguistic information. The data manipulation techniques, recomme...
The Transformer translation model (Vaswani et al., 2017), which relies on self-attention mechanisms, ...
In Neural Machine Translation (NMT), each token prediction is conditioned on the source sentence and...
The integration of syntactic structures into Transformer machine translation has shown positive resu...
The Transformer model is a very recent, fast and powerful discovery in neural machine translation. W...
Neural machine translation has been lately established as the new state of the art in machine transl...
In interactive machine translation (MT), human translators correct errors in automatic translations ...
Introducing factors such as linguistic features has long been proposed in machine translation to imp...
Transformer-based models have brought a radical change to neural machine translation. A key feature ...
Transformer is a neural machine translation model which revolutionizes machine translation. Compared...
Neural machine translation (NMT) has become the de facto standard in the machine translation communi...
This paper presents an extension of neural machine translation (NMT) model to incorporate additional...
Machine translation has received significant attention in the field of natural language processing n...
Differently from the traditional statistical MT that decomposes the translation task into distinct s...