Meta-learning has been shown to be beneficial for low-resource neural machine translation (NMT). However, we find that meta-trained NMT fails to improve translation performance on domains unseen at the meta-training stage. In this paper, we aim to alleviate this issue by proposing a novel meta-curriculum learning approach for domain adaptation in NMT. During meta-training, the NMT model first learns the curricula that are similar across domains, to avoid falling into a bad local optimum early, and then learns the individual, domain-specific curricula to improve its robustness in acquiring domain-specific knowledge. Experimental results on 10 different low-resource domains show that meta-curriculum learning can improve the translation perfo...
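The two-phase training described above (shared curricula first, domain-specific curricula later, with a meta-update across domains) can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the `divergence` score, the helper names, and the Reptile-style outer update on a scalar parameter are all assumptions made for exposition.

```python
def curriculum_order(samples):
    """Sort samples from most general (low divergence from the general
    domain) to most domain-specific (high divergence). The 'divergence'
    field is a hypothetical per-sample difficulty/specificity score."""
    return sorted(samples, key=lambda s: s["divergence"])

def meta_train(theta, domains, inner_lr=0.1, meta_lr=0.5):
    """Toy meta-curriculum loop on a scalar parameter `theta`.

    Stage 0 trains on each domain's more general half of the curriculum;
    stage 1 trains on the domain-specific half. The outer update moves
    theta toward the adapted parameters (a Reptile-style meta-step, used
    here as a simple stand-in for the meta-learning update)."""
    for stage in range(2):
        for samples in domains:
            ordered = curriculum_order(samples)
            half = len(ordered) // 2
            batch = ordered[:half] if stage == 0 else ordered[half:]
            # Inner loop: gradient steps on squared error (phi - y)^2.
            phi = theta
            for s in batch:
                phi -= inner_lr * 2 * (phi - s["y"])
            # Outer (meta) update: pull theta toward the adapted phi.
            theta += meta_lr * (phi - theta)
    return theta
```

Ordering the samples before batching is what makes this a curriculum: early meta-steps only see the portions of each domain that resemble the general domain, which is intended to keep the shared initialization from overfitting to any single domain too early.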
Scarcity of parallel sentence-pairs poses a significant hurdle for training high-quality Neural Mach...
The key challenge of multi-domain translation lies in simultaneously encoding both the general knowl...
We examine the effects of particular orderings of sentence pairs on the on-line training of neural m...
We consider two problems of NMT domain adaptation using meta-learning. First, we want to reach domai...
Neural machine translation (NMT) models have achieved state-of-the-art translation quality with a la...
Curriculum learning hypothesizes that presenting training samples in a meaningful order to machine l...
Differently from the traditional statistical MT that decomposes the translation task into distinct s...
Unsupervised domain translation has recently achieved impressive performance with Generative Adversa...
Generalising to unseen domains is under-explored and remains a challenge in neural machine translati...
The competitive performance of neural machine translation (NMT) critically relies on large amounts o...
Neural machine translation (NMT) has achieved remarkable success in producing high-quality translati...
Machine Learning has revolutionized education by offering numerous practical applications. One such ...
Non-autoregressive translation (NAT) models remove the dependence on previous target tokens and gene...
We present an approach to neural machine translation (NMT) that supports multiple domains in a singl...