State-of-the-art neural (re)rankers are notoriously data-hungry, which, given the lack of large-scale training data in languages other than English, makes them rarely used in multilingual and cross-lingual retrieval settings. Current approaches therefore typically transfer rankers trained on English data to other languages and cross-lingual setups by means of multilingual encoders: they fine-tune all the parameters of a pretrained massively multilingual Transformer (MMT, e.g., multilingual BERT) on English relevance judgments and then deploy it in the target language. In this work, we show that two parameter-efficient approaches to cross-lingual transfer, namely Sparse Fine-Tuning Masks (SFTMs) and Adapters, allow for a more lightweight an...
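To make the adapter-based variant of this transfer concrete, here is a minimal sketch in the style of MAD-X: a task (reranking) adapter is trained on English relevance judgments while a pretrained language adapter sits underneath it, and at inference time only the language adapter is swapped. This is a sketch assuming the AdapterHub `adapters` library; the hub identifiers ("en/wiki@ukp", "de/wiki@ukp"), the adapter names, and the German example pair are illustrative assumptions, not the authors' code.

```python
# Sketch: MAD-X-style adapter transfer for cross-lingual reranking.
# Hub identifiers and adapter names below are illustrative assumptions.
from adapters import AutoAdapterModel
from adapters.composition import Stack
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

# Pretrained language adapters; the MMT body itself stays frozen.
model.load_adapter("en/wiki@ukp", load_as="en")  # source language
model.load_adapter("de/wiki@ukp", load_as="de")  # target language

# New task adapter plus a relevance-scoring head, trained only on
# English relevance judgments.
model.add_adapter("rerank")
model.add_classification_head("rerank", num_labels=1)

# Training: freeze everything except the "rerank" adapter (and head),
# and stack the English language adapter under the task adapter.
model.train_adapter("rerank")
model.set_active_adapters(Stack("en", "rerank"))
# ... train on English (query, document) pairs here ...

# Zero-shot transfer: swap in the target-language adapter at inference.
model.set_active_adapters(Stack("de", "rerank"))
enc = tokenizer("wie hoch ist der eiffelturm",
                "Der Eiffelturm ist 330 m hoch.",
                return_tensors="pt")
score = model(**enc).logits  # relevance score for the (query, document) pair
```

The design point this illustrates: only the small task adapter is trained, so the expensive English-only fine-tuning is reused across all target languages by exchanging one lightweight language module.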
The recently proposed massively multilingual neural machine translation (NMT) system has been shown ...
Multilingual pretrained language models have demonstrated remarkable zero-shot...
Large pre-trained multilingual models such as mBERT and XLM-R enabled effective cross-lingual zero-s...
We investigate adaptation of a supervised machine learning model for reranking of query translation...
With the rapid growth of world-wide information accessibility, cross-language information retriev...
Cross-lingual transfer learning with large multilingual pre-trained models can be an effective appro...
There are significant efforts toward developing better neural approaches for information retrieval p...
Pre-trained multilingual language models show significant performance gains for zero-shot cross-ling...
We present an approach to learning bilingual n-gram correspondences from relevance rankings of Engli...
With the advent of deep neural networks in recent years, Neural Machine Translation (NMT) systems ha...
Today's volume of user-generated, multilingual textual data creates a need for information ...
Neural Machine Translation (NMT) has been shown to be more effective in translation tasks compared t...
Fine-tuning the entire set of parameters of a large pretrained model has become the mainstream appro...
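Since the first abstract names Sparse Fine-Tuning Masks as one of the two parameter-efficient alternatives to the full fine-tuning discussed above, a brief sketch may help: in Lottery-Ticket-style sparse fine-tuning, one records which k parameters moved most during an initial full fine-tuning pass and then restricts all later updates to exactly those positions. This is a hedged PyTorch sketch under that assumption; the function names are illustrative and not taken from any of the cited papers' codebases.

```python
# Sketch: computing and applying a sparse fine-tuning mask (SFTM).
# Only the k most-changed parameters remain trainable afterwards.
import torch

def compute_sft_mask(pretrained_state, finetuned_state, k):
    """Binary masks marking the k parameters with the largest absolute change."""
    deltas = {name: (finetuned_state[name] - pretrained_state[name]).abs()
              for name in pretrained_state}
    flat = torch.cat([d.flatten() for d in deltas.values()])
    threshold = flat.topk(k).values.min()  # k-th largest change overall
    return {name: d >= threshold for name, d in deltas.items()}

def mask_gradients(model, masks):
    """Call after loss.backward(): zero gradients outside the mask so the
    optimizer only updates the selected sparse subset of parameters."""
    for name, param in model.named_parameters():
        if param.grad is not None and name in masks:
            param.grad.mul_(masks[name].to(param.grad.dtype))
```

Because the result of sparse fine-tuning is a sparse difference vector over the pretrained weights, separately trained language and task vectors can simply be added to the base model at transfer time, which is what makes the approach composable across languages.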
Current state-of-the-art approaches to cross-modal retrieval process text and visual input jointly,...
Machine Translation (MT) systems employed to translate queries for Cross-Lingual Information Retriev...