For many tasks, state-of-the-art results have been achieved with Transformer-based architectures, triggering a paradigm shift from task-specific architectures to the fine-tuning of pre-trained language models. The ongoing trend is to train models on ever-increasing amounts of data with ever more parameters, which requires considerable resources. This has spurred intense research into resource efficiency through algorithmic and hardware improvements, evaluated only on English. It raises questions about the usability of these techniques for small-scale learning problems, where only limited training data is available, especially for tasks in under-resourced languages. The lack of appropriately sized corpora...
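The shift this abstract describes, from task-specific architectures to fine-tuning a pre-trained model, can be made concrete with a minimal sketch. This is not the paper's own pipeline: it assumes the Hugging Face transformers library and the publicly released camembert-base checkpoint (see the CamemBERT entry below), and the toy texts and label count are placeholders.

    # Fine-tuning sketch: reuse a pretrained encoder for a small
    # classification task instead of training a task-specific model.
    # Assumes: pip install torch transformers
    import torch
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer, Trainer, TrainingArguments)

    MODEL = "camembert-base"
    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=2)

    # Hypothetical toy corpus; a real run would load a labelled dataset.
    texts = ["Un film remarquable.", "Une intrigue sans intérêt."]
    labels = [1, 0]
    enc = tokenizer(texts, truncation=True, padding=True)

    class ToyDataset(torch.utils.data.Dataset):
        def __init__(self, enc, labels):
            self.enc, self.labels = enc, labels
        def __len__(self):
            return len(self.labels)
        def __getitem__(self, i):
            item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
            item["labels"] = torch.tensor(self.labels[i])
            return item

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="out", num_train_epochs=1,
                               per_device_train_batch_size=2),
        train_dataset=ToyDataset(enc, labels),
    )
    trainer.train()  # only the classification head is new; the encoder is reused

Only the small classification head is randomly initialised here, which is why such transfer can work with far less labelled data than training from scratch.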
In recent years, neural methods for Natural Language Processing (NLP) have consistently and repeated...
The pre-training of large language models usually requires massive amounts of resources, both in ter...
Self-supervised learning (SSL) has driven unprecedented improvements in many different domains...
In the last five years, the rise of self-attentional, Transformer-based architectures has led to state-of-the-art...
Recent advances in spoken language understanding have benefited from self-supervised models trained on la...
Modern Natural Language Processing (NLP) models based on Transformer structures represent the state ...
Recent advances in NLP have significantly improved the performance of language models on a ...
Neural architectures based on self-attention, such as Transformers, recently attracted...
Over the last five years, transfer approaches using Transformer-like models have...
The Transformer model is the state of the art in Machine Translation. However, in general, neural...
Web site: https://camembert-model.fr. Pretrained language models are now ubiquitous in Natural Language...
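As a quick illustration of how such a ubiquitous pretrained model is queried in practice, here is a minimal sketch assuming the Hugging Face transformers pipeline API and the camembert-base checkpoint distributed via the site above:

    # Masked-word prediction with a pretrained French model.
    # Assumes: pip install torch transformers
    from transformers import pipeline

    fill = pipeline("fill-mask", model="camembert-base")
    # CamemBERT uses "<mask>" as its mask token.
    for pred in fill("Le camembert est un fromage <mask>.")[:3]:
        print(pred["token_str"], round(pred["score"], 3))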
Some Transformer-based models can perform cross-lingual transfer learning: those models can be trained...
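The cross-lingual setting described above can be sketched as follows. This is an illustrative assumption, not the paper's setup: it uses the multilingual xlm-roberta-base checkpoint (not named in this abstract), and the fine-tuning step itself is elided; the point is that one shared encoder serves both languages.

    # Zero-shot cross-lingual transfer sketch: fine-tune a multilingual
    # encoder on one language (e.g. English), then apply it unchanged
    # to another (e.g. French). Assumes: pip install torch transformers
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("xlm-roberta-base")
    clf = AutoModelForSequenceClassification.from_pretrained(
        "xlm-roberta-base", num_labels=2)

    # ... fine-tune clf on English labelled data here (omitted) ...

    # One vocabulary and one parameter set cover many languages, so the
    # fine-tuned model can score French input directly:
    batch = tok("Ce produit est excellent.", return_tensors="pt")
    with torch.no_grad():
        logits = clf(**batch).logits
    print(logits.softmax(-1))  # head untrained in this sketch; scores illustrative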