Omission and addition of content are typical issues in neural machine translation. We propose a method for detecting such phenomena with off-the-shelf translation models. Using contrastive conditioning, we compare the likelihood of a full sequence under a translation model to the likelihood of its parts, given the corresponding source or target sequence. This allows us to pinpoint superfluous words in the translation and untranslated words in the source even in the absence of a reference translation. The accuracy of our method is comparable to that of a supervised method that requires a custom quality estimation model.
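To make the comparison concrete, the following is a minimal sketch of how detecting superfluous words (overtranslation) via contrastive conditioning might look with an off-the-shelf Hugging Face MarianMT model. The model name, the word-level deletion scheme, and the plain log-likelihood comparison are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch: flag target words whose deletion makes the translation *more*
# likely under an off-the-shelf NMT model. Assumes a Hugging Face MarianMT
# checkpoint; the word-level deletions and the raw log-likelihood
# comparison are simplifying assumptions for illustration.
import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"  # assumed off-the-shelf model
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name).eval()

def log_likelihood(source: str, target: str) -> float:
    """Log-probability of `target` given `source` under the model."""
    batch = tokenizer([source], text_target=[target], return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    # Sum the log-probabilities assigned to the reference target tokens.
    logprobs = out.logits.log_softmax(dim=-1)
    labels = batch["labels"]
    token_scores = logprobs[0].gather(1, labels[0].unsqueeze(1)).squeeze(1)
    return token_scores.sum().item()

def superfluous_words(source: str, translation: str) -> list[str]:
    """Contrastive conditioning: compare the full translation against
    partial translations with one word deleted."""
    full_score = log_likelihood(source, translation)
    words = translation.split()
    flagged = []
    for i in range(len(words)):
        partial = " ".join(words[:i] + words[i + 1:])
        if log_likelihood(source, partial) > full_score:
            flagged.append(words[i])
    return flagged
```

Detecting untranslated source words would work analogously with a model in the reverse translation direction, deleting source words and checking whether the source becomes more likely given the translation. Note that raw sequence log-probabilities tend to favor shorter sequences, so a practical implementation would need length normalization or a decision margin on top of this sketch.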