NLP systems typically require support for more than one language. Because different languages come with different amounts of supervision, cross-lingual transfer benefits languages with little to no training data by transferring knowledge from better-resourced ones. From an engineering perspective, multilingual NLP simplifies development and maintenance by serving multiple languages with a single system. Both cross-lingual transfer and multilingual NLP rest on cross-lingual representations as their foundation. Just as BERT revolutionized representation learning in NLP, it also revolutionized cross-lingual representations and cross-lingual transfer: Multilingual BERT (mBERT), trained on Wikipedia data in 104 languages, was released as a single drop-in replacement for separate single-language BERT models. ...
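To make "cross-lingual representations" concrete, the sketch below is a minimal illustrative probe, not a method from the work discussed here. It loads the public mBERT checkpoint (`bert-base-multilingual-cased`) through the Hugging Face `transformers` library, mean-pools the final hidden states into sentence vectors, and checks that a translation pair lands closer together in the shared space than an unrelated sentence. The example sentences and the pooling choice are assumptions made for illustration.

```python
# Illustrative sketch: probing mBERT's shared cross-lingual embedding space.
# Assumes the Hugging Face `transformers` and `torch` packages are installed.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

def embed(sentence: str) -> torch.Tensor:
    """Mean-pool mBERT's final hidden states into one sentence vector."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state    # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)     # ignore padding positions
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# A translation pair (English/German) versus an unrelated English sentence:
# in a shared cross-lingual space, the pair should score higher similarity.
en = embed("The cat sits on the mat.")
de = embed("Die Katze sitzt auf der Matte.")
unrelated = embed("Stock prices fell sharply on Monday.")

print("en ~ de:       ", torch.cosine_similarity(en, de).item())
print("en ~ unrelated:", torch.cosine_similarity(en, unrelated).item())
```

The same loading pattern underlies zero-shot cross-lingual transfer in practice: fine-tune one multilingual checkpoint on labeled data in a high-resource language, then apply it directly to inputs in languages that contributed no labeled examples.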