In Machine Translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a simple yet promising approach to adding contextual information to Neural Machine Translation. We present a method to add source context that captures the whole document with accurate boundaries, taking every word into account. We provide this additional information to a Transformer model and study the impact of our method on three language pairs. The proposed approach obtains promising results on the English-German, English-French and French-English document-level translation tasks. We observe interesting cross-sentential behaviors where the model learns to use document-level information to improve translation.
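To make the idea concrete, below is a minimal sketch of one plausible way to feed whole-document source context to a Transformer encoder: a document vector is pooled from the embeddings of every source token in the document and added to each token embedding of the sentence being translated. The class name, the mean-pooling scheme, and the omission of positional encodings are illustrative assumptions for this sketch, not the exact mechanism described in the paper.

```python
# Hedged sketch: injecting whole-document source context into a Transformer
# encoder. The document context is assumed here to be the mean of all source
# token embeddings in the document, added to every sentence token embedding.
# Positional encodings are omitted for brevity.
import torch
import torch.nn as nn


class ContextAwareEncoder(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 512, nhead: int = 8,
                 num_layers: int = 6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, sent_ids: torch.Tensor, doc_ids: torch.Tensor):
        """sent_ids: (batch, sent_len) token ids of the current sentence.
        doc_ids:  (batch, doc_len) token ids of the whole source document."""
        sent_emb = self.embed(sent_ids)                      # (B, S, D)
        doc_emb = self.embed(doc_ids)                        # (B, T, D)
        mask = (doc_ids != 0).unsqueeze(-1).float()          # ignore padding
        doc_vec = (doc_emb * mask).sum(1) / mask.sum(1).clamp(min=1.0)
        # Add the document-level vector to every token of the sentence.
        contextual = sent_emb + doc_vec.unsqueeze(1)
        key_padding = sent_ids == 0
        return self.encoder(contextual, src_key_padding_mask=key_padding)


# Toy usage: encode a batch of sentences with access to their full documents.
enc = ContextAwareEncoder(vocab_size=1000)
sentence = torch.randint(1, 1000, (2, 7))   # batch of 2 sentences
document = torch.randint(1, 1000, (2, 50))  # their full source documents
print(enc(sentence, document).shape)        # torch.Size([2, 7, 512])
```

The appeal of this kind of formulation is that the extra signal respects document boundaries and covers every source word, while leaving the sentence-level Transformer architecture otherwise unchanged.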