While recent work on multilingual language models has demonstrated their capacity for cross-lingual zero-shot transfer on downstream tasks, there is a lack of consensus in the community as to what shared properties between languages enable such transfer. Analyses involving pairs of natural languages are often inconclusive and contradictory since languages simultaneously differ in many linguistic aspects. In this paper, we perform a large-scale empirical study to isolate the effects of various linguistic properties by measuring zero-shot transfer between four diverse natural languages and their counterparts constructed by modifying aspects such as the script, word order, and syntax. Among other things, our experiments show that the absence o...
For many (minority) languages, the resources needed to train large models are not available. We inve...
Large pretrained multilingual models, trained on dozens of languages, have delivered promising resul...
Cross-lingual semantic parsing transfers parsing capability from a high-resource language (e.g., Eng...
Multilingual pretrained language models have demonstrated remarkable zero-shot...
It has been shown that multilingual BERT (mBERT) yields high-quality multilingual representations ...
NLP systems typically require support for more than one language. As different languages have differ...
Pre-trained multilingual language models play an important role in cross-lingual natural language un...
In recent years, pre-trained Multilingual Language Models (MLLMs) have shown a strong a...
Large pre-trained multilingual models such as mBERT and XLM-R enabled effective cross-lingual zero-s...
Knowledge transfer between neural language models is a widely used technique t...
Cross-lingual transfer learning with large multilingual pre-trained models can be an effective appro...
Supervised deep learning-based approaches have been applied to task-oriented dialog and have proven ...
Pre-trained multilingual language models show significant performance gains for zero-shot cross-ling...
Transfer learning has led to large gains in performance for nearly all NLP tasks while making downst...