Text classification is one of the most important tasks in natural language processing (NLP). Recent advances with pre-trained language models (PLMs) have shown remarkable success on this task. However, the strong results obtained by PLMs depend heavily on large amounts of task-specific labeled data, which may not be available in many application scenarios due to data access and privacy constraints. The recently proposed prompt-based fine-tuning paradigm improves the performance of PLMs for few-shot text classification with task-specific templates. Yet, it is unclear how prompting knowledge can be transferred across tasks for mutual reinforcement. We propose TransPrompt v2, a novel transferable prompting framework...
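For context, the prompt-based fine-tuning paradigm mentioned in this abstract recasts classification as masked-token prediction: the input is wrapped in a task-specific template containing a mask slot, and a verbalizer maps label words to classes. The sketch below illustrates that general paradigm only, not the TransPrompt v2 method itself; the backbone (bert-base-uncased), the template, and the sentiment verbalizer are all illustrative assumptions.

# Minimal sketch of prompt-based classification with a masked LM.
# The model name, template, and verbalizer are assumptions for illustration,
# not the TransPrompt v2 implementation.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "bert-base-uncased"  # assumed backbone
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

# Task-specific template and a hypothetical verbalizer (label word -> class).
template = "{text} It was {mask}."
verbalizer = {"great": "positive", "terrible": "negative"}

def classify(text: str) -> str:
    prompt = template.format(text=text, mask=tokenizer.mask_token)
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Locate the [MASK] position and compare the MLM's scores
    # for each label word at that position.
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0].item()
    scores = {
        label: logits[0, mask_pos, tokenizer.convert_tokens_to_ids(word)].item()
        for word, label in verbalizer.items()
    }
    return max(scores, key=scores.get)

print(classify("A touching and beautifully acted film."))

In this setup, "fine-tuning" means updating the masked LM so that the correct label word becomes more probable at the mask position; because the task is expressed in the model's native pre-training objective, far fewer labeled examples are needed than in standard head-based fine-tuning.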
The current generation of neural network-based natural language processing models excels at learning...
Speech representations learned from Self-supervised learning (SSL) models can benefit various speech...
Prompt-based fine-tuning has boosted the performance of Pre-trained Language Models (PLMs) on few-sh...
Pretrained language models (PLMs) have made remarkable progress in text generation tasks via fine-tu...
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on...
Transfer learning from large language models (LLMs) has emerged as a powerful technique to enable kn...
Pre-trained multilingual language models show significant performance gains for zero-shot cross-ling...
It has been shown for English that discrete and soft prompting perform strongly in few-shot learning ...
Supervised deep learning-based approaches have been applied to task-oriented dialog and have proven ...
Providing pretrained language models with simple task descriptions in natural language enables them ...
Structure prediction (SP) tasks are important in natural language understanding in the sense that th...
Multi-Task Learning (MTL) models have shown their robustness, effectiveness, and efficiency for tran...
In recent years, there has been significant progress in developing pre-trained language models for N...
Large language models have recently been shown to attain reasonable zero-shot ...