Adapter modules were recently introduced as an efficient alternative to fine-tuning in NLP. Adapter tuning consists of freezing the pretrained parameters of a model and injecting lightweight modules between layers, resulting in the addition of only a small number of task-specific trainable parameters. While adapter tuning has been investigated for multilingual neural machine translation, this paper proposes a comprehensive analysis of adapters for multilingual speech translation (ST). Starting from different pre-trained models (a multilingual ST model trained on parallel data, or a multilingual BART (mBART) trained on non-parallel multilingual data), we show that adapters can be used to: (a) efficiently specialize ST to specific langua...
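As a concrete illustration of the scheme this abstract describes (freeze the pretrained weights, inject small trainable modules between layers), here is a minimal PyTorch sketch of a bottleneck adapter. The module structure, bottleneck size, and placement are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: LayerNorm -> down-projection -> ReLU -> up-projection,
    added residually to a frozen layer's output (illustrative sketch)."""
    def __init__(self, d_model: int, bottleneck: int = 64):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual connection keeps the pretrained representation intact;
        # the adapter only learns a small task-specific correction.
        return x + self.up(torch.relu(self.down(self.norm(x))))

def freeze_pretrained(model: nn.Module) -> None:
    """Freeze all pretrained parameters; only adapter weights remain trainable."""
    for p in model.parameters():
        p.requires_grad = False

# Usage sketch: a toy "pretrained" layer is frozen, and one adapter
# (e.g., per Transformer layer, per language pair) stays trainable.
backbone = nn.Linear(512, 512)
freeze_pretrained(backbone)
adapter = Adapter(d_model=512, bottleneck=64)
h = torch.randn(2, 10, 512)              # (batch, time, hidden)
out = adapter(backbone(h))               # same shape, task-specific transform
trainable = sum(p.numel() for p in adapter.parameters() if p.requires_grad)
```

With a bottleneck of 64 on a 512-dimensional model, each adapter adds roughly 2 x 512 x 64 weights plus biases and LayerNorm parameters, which is why the per-language cost stays small relative to full fine-tuning.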
Master's thesis in Theoretical and Applied Linguistics. Supervisor: Maite Melero Nogues. For many p...
In this paper we present our latest investigation on multilingual bottle-neck (BN) features and their ...
Although machine translation (MT) traditionally pursues “human-oriented” objectives, humans are not ...
Nowadays, training end-to-end neural models for spoken language translation (SLT) still has to confr...
Adapter modules have emerged as a general parameter-efficient means to specialize a pretrained encod...
Pre-trained language models have received extensive attention in recent years. However, it is still chall...
Multilingual machine translation suffers from negative interference across languages. A common solut...
Humans benefit from communication but suffer from language barriers. Machine translation (MT) aims t...
Multilingual Neural Machine Translation (MNMT) for low-resource languages (LRL) can be enhanced by ...
End-to-end speech-to-text translation models are often initialized with pre-trained speech encoder a...
Spoken language translation (SLT) exists within one of the most challenging intersections of speech ...
Neural machine translation models have been shown to achieve high quality when trained and fed with well ...
Using a mix of shared and language-specific (LS) parameters has shown promise in multilingual neural...
This paper describes Edinburgh’s submissions to the IWSLT2021 multilingual speech translation (ST) t...
Unsupervised cross-lingual pretraining has achieved strong results in neural machine translation (NM...