State-of-the-art multilingual machine translation relies on a shared encoder-decoder. In this paper, we propose an alternative approach based on language-specific encoder-decoders, which can be easily extended to new languages by learning their corresponding modules. To establish a common interlingua representation, we simultaneously train N initial languages. Our experiments show that the proposed approach improves over the shared encoder-decoder for the initial languages and when adding new languages, without the need to retrain the remaining modules. All in all, our work closes the gap between shared and language-specific encoder-decoders, advancing toward modular multilingual machine translation systems that can be flexibly extended in ...
Massively multilingual models for neural machine translation (NMT) are theoretically attractive, but...
Neural machine translation has considerably improved the quality of automatic translations by learni...
Transfer learning between different language pairs has shown its effectiveness for Neural Machine Tr...
State-of-the-art multilingual machine translation relies on a universal encoder-decoder, which requi...
In this paper, we propose a multilingual encoder-decoder architecture capable of obtaining multiling...
Neural Machine Translation (NMT) has been shown to be more effective in translation tasks compared t...
We propose a machine translation architecture based on autoencoders and a shared interlingua represe...
This paper describes the participation of the BSC team in the WMT2021's Multilingual Low-Resource ...
Neural Machine Translation has been shown to enable inference and cross-lingual knowledge transfer ...
A common intermediate language representation in neural machine translation can be used to extend bi...
Zero-shot translation is a transfer learning setup that refers to the ability of neural machine tran...
Humans benefit from communication but suffer from language barriers. Machine translation (MT) aims t...
In this paper, we propose an architecture for machine translation (MT) capable of obtaining multilin...
The attention-based encoder-decoder is an effective architecture for neural machine translation (NMT),...
NLP systems typically require support for more than one language. As different languages have differ...