This paper describes the methods behind the systems submitted by the University of Groningen for the WMT 2021 Unsupervised Machine Translation task for German–Lower Sorbian (DE–DSB): translation from a high-resource language to a low-resource one. Our system uses a transformer encoder-decoder architecture in which we make three changes to the standard training procedure. First, our training focuses on two languages at a time, contrasting with a wealth of research on multilingual systems. Second, we introduce a novel method for initializing the vocabulary of an unseen language, achieving improvements of 3.2 BLEU for DE->DSB and 4.0 BLEU for DSB->DE. Lastly, we experiment with the order in which offline and online back-translation are used to train an un...
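As a rough illustration of the online back-translation loop referred to in this abstract, the sketch below shows the standard round-trip training pattern using only monolingual data: each direction's current model produces synthetic sources for the other direction to train on. The ToyTranslator class, its translate/train_step methods, and the example sentences are hypothetical placeholders for illustration, not the paper's actual implementation.

```python
# Minimal sketch of one round of online back-translation for a DE<->DSB system.
# All model objects and methods here are illustrative stand-ins (assumptions),
# not the submitted system's code.


class ToyTranslator:
    """Stand-in for an NMT model; reverses word order as a fake 'translation'."""

    def __init__(self, src_lang: str, tgt_lang: str):
        self.src_lang = src_lang
        self.tgt_lang = tgt_lang

    def translate(self, sentence: str) -> str:
        # Placeholder inference; a real system would run beam search here.
        return " ".join(reversed(sentence.split()))

    def train_step(self, src: str, tgt: str) -> None:
        # Placeholder update; a real system would compute a loss and backprop.
        pass


def online_backtranslation_round(mono_de, mono_dsb, de2dsb, dsb2de):
    """One round of online back-translation using monolingual data only.

    Synthetic source sentences are generated on the fly by the current
    reverse-direction model, and the forward model is trained to recover
    the original monolingual sentence.
    """
    for de_sentence in mono_de:
        synthetic_dsb = de2dsb.translate(de_sentence)   # DE -> synthetic DSB
        dsb2de.train_step(synthetic_dsb, de_sentence)   # train DSB->DE on (synthetic, real)

    for dsb_sentence in mono_dsb:
        synthetic_de = dsb2de.translate(dsb_sentence)   # DSB -> synthetic DE
        de2dsb.train_step(synthetic_de, dsb_sentence)   # train DE->DSB on (synthetic, real)


if __name__ == "__main__":
    # Placeholder monolingual corpora (not real German / Lower Sorbian data).
    mono_de = ["de sentence one", "de sentence two"]
    mono_dsb = ["dsb sentence one", "dsb sentence two"]
    de2dsb = ToyTranslator("de", "dsb")
    dsb2de = ToyTranslator("dsb", "de")
    online_backtranslation_round(mono_de, mono_dsb, de2dsb, dsb2de)
    print("Completed one illustrative back-translation round.")
```

In offline back-translation, by contrast, the synthetic corpus is generated once with a fixed model and then used for training in bulk; the paper's experiments concern the order in which the two variants are applied.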
This article presents UnsupNMT, a 3-year project of which the first year has already been completed....
We present a survey covering the state of the art in low-resource machine tran...
Some natural languages belong to the same family or share similar syntactic and/or semantic regulari...
In this paper, we present the systems submitted by our team from the Institute of ICT (HEIG-VD / HES...
This paper describes the participation of the BSC team in the WMT2021's Multilingual Low-Resource ...
Unsupervised Machine Translation has been advancing our ability to translate without parallel data, bu...
Natural language processing of Low-Resource Languages (LRL) is often challenged by the lack of data....
The scarcity of parallel data is a major limitation for Neural Machine Translation (NMT) systems, in...
Modern machine translation relies on strong supervision in the form of parallel corpora. Such ...
We provide new tools and techniques for improving machine translation for low-resource lan...
There are several approaches for improving neural machine translation for low-resource languages: mo...