This work aims to produce translations that convey source language content at a formality level that is appropriate for a particular audience. Framing this problem as a neural sequence-to-sequence task ideally requires training triplets consisting of a bilingual sentence pair labeled with target language formality. However, in practice, available training examples are limited to English sentence pairs of different styles, and bilingual parallel sentences of unknown formality. We introduce a novel training scheme for multi-task models that automatically generates synthetic training triplets by inferring the missing element on the fly, thus enabling end-to-end training. Comprehensive automatic and human assessments show that our best model ou...
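To make the idea of on-the-fly synthetic triplets concrete, here is a minimal sketch, not the paper's actual implementation: it assumes a hypothetical formality classifier and a hypothetical back-translation function, and shows how the missing element of each triplet (the formality label of a bilingual pair, or the source side of an English style pair) could be filled in during training.

```python
# Minimal sketch, not the paper's implementation: completing whichever element
# of a (source, target, formality) training triplet is missing, on the fly.
# `formality_classifier` and `back_translate` are hypothetical callables.
import random

def make_synthetic_triplet(example, formality_classifier, back_translate):
    if example["type"] == "bilingual":
        # Bilingual pair of unknown formality: infer the target-side formality label.
        src, tgt = example["src"], example["tgt"]
        formality = formality_classifier(tgt)  # e.g. "formal" or "informal"
    else:
        # English style pair (informal/formal): pick a target formality and infer
        # the missing source-language side by back-translating the chosen target.
        formality = random.choice(["formal", "informal"])
        tgt = example[formality]
        src = back_translate(tgt)  # hypothetical English -> source-language model
    return {"src": src, "tgt": tgt, "formality": formality}
```

In an end-to-end setup of this kind, the classifier and back-translation components could plausibly be sub-networks of the same multi-task model, so the synthetic triplets would improve as training progresses.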
Recent developments in machine translation experiment with the idea that a model can improve the tra...
Conventional interactive machine translation typically requires a human translator to validate every...
Humans benefit from communication but suffer from language barriers. Machine translation (MT) aims t...
Unlike traditional statistical MT, which decomposes the translation task into distinct s...
Neural text simplification has gained increasing attention in the NLP community thanks to recen...
Pre-training and fine-tuning have achieved great success in the natural language processing field. The stan...
Neural machine translation requires large amounts of parallel training text to learn a reasonable-qu...
Unsupervised cross-lingual pretraining has achieved strong results in neural machine translation (NM...
We propose to achieve explainable neural machine translation (NMT) by changing the output representa...
Pre-training and fine-tuning have become the de facto paradigm in many natural language processing (...
Monolingual data have been demonstrated to be helpful in improving translation quality of both stati...
Semi-supervised learning algorithms in neural machine translation (NMT) have significantly improved ...
The Transformer is a recent, fast, and powerful architecture for neural machine translation. W...
This paper describes the University of Maryland's submission to the Special Task on Formality Contro...
Unsupervised neural machine translation (UNMT) tackles translation using only monolingual corpora in the two languages involved. Since there is no explici...