Supervised deep learning-based approaches have been applied to task-oriented dialog and have proven to be effective for limited domain and language applications when a sufficient number of training examples are available. In practice, these approaches suffer from the drawbacks of domain-driven design and under-resourced languages. Domain and language models are supposed to grow and change as the problem space evolves. On the one hand, research on transfer learning has demonstrated the cross-lingual ability of multilingual Transformer-based models to learn semantically rich representations. On the other hand, in addition to the above approaches, meta-learning has enabled the development of task and language learning algorithms capable of far genera...
The current generation of neural network-based natural language processing models excels at learning...
Transfer learning from large language models (LLMs) has emerged as a powerful technique to enable kn...
Large-scale annotated datasets are an indispensable ingredient of modern Natural Language Processing...
Transfer learning has led to large gains in performance for nearly all NLP tasks while making downst...
Pretrained multilingual language models have become a common tool in transferring NLP capabilities t...
Large-scale generative language models such as GPT-3 are competitive few-shot learners. While these ...
Transfer learning based on pretraining language models on a large amount of ra...
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on...
Recently, there has been an increasing interest in models that generate natural language explanation...
Recent work has shown that neural models can be successfully trained on multiple languages simultaneou...
Pre-trained multilingual language models play an important role in cross-lingual natural language un...
Cross-lingual transfer learning with large multilingual pre-trained models can be an effective appro...
State-of-the-art approaches to most Natural Language Processing (NLP) tasks have achieved near huma...