Some Transformer-based models can perform cross-lingual transfer learning: they can be trained on a specific task in one language and yield relatively good results on the same task in another language, despite having been pre-trained on monolingual tasks only. However, there is no consensus yet on whether these models learn universal patterns across languages. We propose a word-level, task-agnostic method to evaluate the alignment of the contextualized representations built by such models. We show that our method provides more accurate translated word pairs than previous methods for evaluating word-level alignment, and our results show that some inner layers of multilingual Transformer-based models outperform other explicitly a...
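The truncated abstract does not spell out the exact procedure, but the general idea can be illustrated with a minimal sketch: extract the contextualized representation of a word and of its translation at every layer of a multilingual Transformer, then measure their cosine similarity layer by layer. The model name (mBERT), the example sentence pair, the word pair (cat/chat), and the naive subtoken lookup below are all illustrative assumptions, not the paper's actual setup.

```python
# Sketch: layer-wise similarity of contextualized embeddings for a
# translated word pair, using HuggingFace transformers (assumed tooling).
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # assumed; any multilingual encoder works
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_hidden_states=True)
model.eval()

def word_states(sentence, word):
    """Return one vector per layer for the first subtoken of `word` in `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).hidden_states  # embedding layer + one tensor per layer
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    # Naive lookup of the word's first subtoken, for illustration only.
    idx = next(i for i, t in enumerate(tokens)
               if t.lstrip("#") and word.lower().startswith(t.lstrip("#").lower()))
    return [layer[0, idx] for layer in hidden]

src_vecs = word_states("The cat sleeps on the mat.", "cat")
tgt_vecs = word_states("Le chat dort sur le tapis.", "chat")
for layer, (s, t) in enumerate(zip(src_vecs, tgt_vecs)):
    sim = torch.nn.functional.cosine_similarity(s, t, dim=0).item()
    print(f"layer {layer:2d}: cosine(cat, chat) = {sim:.3f}")
```

Per the abstract, such a measurement would be aggregated over many translated word pairs; the reported finding is that alignment varies across layers, with some inner layers doing best.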
This paper describes our submission UFAL MULTIVEC to the WMT16 Quality Estimation Shared Task, for E...
The Transformer model is a recent, fast, and powerful architecture for neural machine translation. W...
In this thesis, we present a Transformer-based multilingual embedding model to represent sentences...
One aspect of machine translation that needs to change is the models' ability to...
Word alignment, which aims to extract lexical translation equivalents between source and target sente...
Pre-trained multilingual language models play an important role in cross-lingual natural language un...
Multilingual pretrained language models have demonstrated remarkable zero-shot...
In this paper, we present a thorough investigation of methods that align pre-trained contextualized ...
The Transformer is a neural machine translation model that has revolutionized machine translation. Compared...
Pretrained multilingual text encoders based on neural Transformer architectures, such as multilingua...
Pre-trained multilingual language models show significant performance gains for zero-shot cross-ling...
Cross-lingual semantic parsing transfers parsing capability from a high-resource language (e.g., Eng...
Cross-language learning allows one to use training data from one ...
Thesis (Master's), University of Washington, 2020. This work presents methods for learning cross-lingu...