Neural Machine Translation (NMT) can be improved by including document-level contextual information. For this purpose, we propose a hierarchical attention model to capture the context in a structured and dynamic manner. The model is integrated into the original NMT architecture as another level of abstraction, conditioning on the NMT model's own previous hidden states. Experiments show that hierarchical attention significantly improves the BLEU score over a strong NMT baseline that incorporates the state of the art in context-aware methods, and that both the encoder and decoder benefit from context in complementary ways.
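The abstract gives no implementation details, so the following is only a minimal sketch of the two-level (word-level, then sentence-level) attention idea it describes: hidden states of previous sentences are summarized by word-level attention, a sentence-level attention combines those summaries into a document context vector, and a gate merges that context with the current hidden state. All names (HierarchicalContext, word_query, sent_query), the scaled dot-product scoring, and the sigmoid gate parameterization are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class HierarchicalContext(nn.Module):
    """Illustrative two-level attention over previous sentences.

    Assumed layer names and dot-product scoring; the paper's actual
    parameterization (e.g., MLP attention) may differ.
    """

    def __init__(self, d: int):
        super().__init__()
        self.word_query = nn.Linear(d, d)   # maps h_t to a word-level query
        self.sent_query = nn.Linear(d, d)   # maps h_t to a sentence-level query
        self.gate = nn.Linear(2 * d, d)     # lambda_t = sigmoid(W [h_t; d_t])
        self.scale = d ** 0.5

    def attend(self, q, keys, mask=None):
        # q: (B, d); keys: (B, T, d) -> attention-weighted sum over T
        scores = torch.einsum("bd,btd->bt", q, keys) / self.scale
        if mask is not None:
            scores = scores.masked_fill(~mask, float("-inf"))
        w = torch.softmax(scores, dim=-1)
        return torch.einsum("bt,btd->bd", w, keys)

    def forward(self, h_t, ctx, ctx_mask=None):
        # h_t: (B, d) current hidden state of the NMT model
        # ctx: (B, K, T, d) hidden states of K previous sentences, T words each
        B, K, T, d = ctx.shape
        # word-level attention: summarize each previous sentence
        qw = self.word_query(h_t)
        qw_rep = qw.unsqueeze(1).expand(B, K, d).reshape(B * K, d)
        flat = ctx.reshape(B * K, T, d)
        m = ctx_mask.reshape(B * K, T) if ctx_mask is not None else None
        s = self.attend(qw_rep, flat, m).reshape(B, K, d)
        # sentence-level attention: combine sentence summaries into d_t
        d_t = self.attend(self.sent_query(h_t), s)
        # gate the document context into the current hidden state
        lam = torch.sigmoid(self.gate(torch.cat([h_t, d_t], dim=-1)))
        return lam * h_t + (1 - lam) * d_t
```

The gated combination reflects the abstract's claim that context is added "as another level of abstraction" on top of the sentence-level model: when the gate saturates toward 1, the module falls back to the plain sentence-level hidden state, so the baseline behavior is recoverable.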