In this paper, we explore the challenging problem of performing a generative task (i.e., summarization) in a target language when labeled data is only available in English. We assume a strict setting with no access to parallel data or machine translation. Prior work has shown, and we confirm, that standard transfer learning techniques struggle in this setting, as a generative multilingual model fine-tuned purely on English catastrophically forgets how to generate non-English. Given the recent rise of parameter-efficient adaptation techniques (e.g., prompt tuning), we conduct the first investigation into how well these methods can overcome catastrophic forgetting to enable zero-shot cross-lingual generation. We find that parameter-efficient ...
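To make the parameter-efficient setup concrete, here is a minimal PyTorch sketch of prompt tuning in the spirit the abstract describes: every pretrained weight of the multilingual model is frozen, and only a small matrix of soft prompt embeddings prepended to the input is trained on the English task data. All names here (`PromptTuningWrapper`, `base_model`, `prompt_len`) are illustrative assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn

class PromptTuningWrapper(nn.Module):
    """Sketch of prompt tuning: the pretrained model stays frozen; only
    `prompt_len` soft prompt vectors prepended to the input embeddings
    are trained. Illustrative only, not the paper's actual code."""

    def __init__(self, base_model: nn.Module, embed_dim: int, prompt_len: int = 20):
        super().__init__()
        self.base_model = base_model
        # Freeze every pretrained parameter. Because the multilingual
        # weights are never updated, the model's ability to generate
        # non-English text is less easily overwritten by English-only
        # fine-tuning (the catastrophic-forgetting failure mode above).
        for p in self.base_model.parameters():
            p.requires_grad = False
        # The only trainable parameters: a small (prompt_len x embed_dim)
        # matrix of soft prompt embeddings.
        self.soft_prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.5)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) token embeddings.
        batch = input_embeds.size(0)
        prompt = self.soft_prompt.unsqueeze(0).expand(batch, -1, -1)
        # Prepend the learned prompt to every example, then run the
        # frozen backbone on the extended sequence.
        return self.base_model(torch.cat([prompt, input_embeds], dim=1))


# Usage sketch: only the soft prompt is passed to the optimizer, so the
# multilingual backbone is untouched. The tiny encoder here is a stand-in
# for a real pretrained multilingual model.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True),
    num_layers=2,
)
model = PromptTuningWrapper(encoder, embed_dim=512, prompt_len=20)
optimizer = torch.optim.Adam([model.soft_prompt], lr=0.3)
```

The design point the sketch illustrates is that the trainable footprint is a few thousand parameters rather than the full model, which is what lets such methods adapt to the English task without overwriting the multilingual generation ability.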