Domain-specific text classification faces the challenge of scarce labeled data due to the high cost of manual labeling. Prompt-learning, known for its efficiency in few-shot scenarios, has been proposed as an alternative to traditional fine-tuning methods. Moreover, although large language models (LLMs) have gained prominence, small language models (SLMs, with under 1B parameters) offer significant customizability, adaptability, and cost-effectiveness for domain-specific tasks, given industry constraints. In this study, we investigate the potential of SLMs combined with the prompt-learning paradigm for domain-specific text classification, specifically within customer-agent interactions in retail. Our evaluations show that, in few-shot settings whe...
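To make the prompt-learning setup concrete, the sketch below shows cloze-style classification with a small masked language model: the input is wrapped in a template with one mask slot, and each class is scored by the logit of its verbalizer word at that slot. The backbone (bert-base-uncased), the template, the labels, and the verbalizer words are illustrative assumptions, not the configuration used in the study.

```python
# A minimal sketch of prompt-learning for text classification with a small
# masked language model. Template, labels, and verbalizer are hypothetical.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Hypothetical verbalizer: one label word per class.
verbalizer = {"complaint": "bad", "praise": "great"}

def classify(text: str) -> str:
    # Wrap the input in a cloze-style template with a single mask slot.
    prompt = f"{text} Overall, the customer felt it was {tokenizer.mask_token}."
    inputs = tokenizer(prompt, return_tensors="pt")
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    # Score each class by the logit of its verbalizer token at the mask.
    scores = {
        label: logits[tokenizer.convert_tokens_to_ids(word)].item()
        for label, word in verbalizer.items()
    }
    return max(scores, key=scores.get)

print(classify("The agent resolved my refund in two minutes."))
```

Because only the template and verbalizer encode the task, this setup needs no task-specific classification head, which is what makes it attractive when labeled data is scarce.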
Prompt-based learning has shown considerable promise in reformulating various downstream tasks as cl...
Prompt-based learning has shown its effectiveness in few-shot text classification. One important fac...
When scaled to hundreds of billions of parameters, pretrained language models such as GPT-3 (Brown e...
Prompt-based fine-tuning has boosted the performance of Pre-trained Language Models (PLMs) on few-sh...
Prompt learning is a new paradigm in the Natural Language Processing (NLP) field that has shown imp...
Recent advances in large pre-trained language models (PLMs) have led to impressive gains on natural languag...
In recent years, the natural language processing (NLP) community has seen remarkable progress in the...
Pretrained language models (PLMs) have made remarkable progress in table-to-text generation tasks. H...
Recent few-shot methods, such as parameter-efficient fine-tuning (PEFT) and pattern exploiting train...
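As background on the PEFT family named in the abstract above, here is a hedged sketch of one common PEFT technique, a LoRA-style adapter: the pretrained weight is frozen and only a low-rank correction is trained, so the number of updated parameters scales with the rank r rather than the full weight matrix. All dimensions and hyperparameters are illustrative, and this is not presented as the specific method of that paper.

```python
# A minimal LoRA-style adapter sketch: freeze the pretrained linear layer
# and train only a low-rank update B @ A, scaled by alpha / r.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # freeze pretrained weights
        # Standard LoRA init: A small gaussian, B zero, so training
        # starts from the unmodified pretrained behavior.
        self.lora_a = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus scaled low-rank correction.
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scaling

layer = LoRALinear(nn.Linear(768, 768))
out = layer(torch.randn(2, 768))
print(out.shape)  # torch.Size([2, 768])
```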
When primed with only a handful of training samples, very large, pretrained language models such as ...
We present a new method, LiST (short for Lite Prompted Self-Training), for parameter-efficient fine-t...
Prompt-based models have garnered considerable attention from researchers due to their remarkable advanc...
In recent years, there has been significant progress in developing pre-trained language models for N...
Through in-context learning (ICL), large-scale language models are effective few-shot learners witho...
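To illustrate the ICL setup referenced above, the following minimal sketch builds a few-shot prompt by concatenating labeled demonstrations before the query; the model then completes the label slot with no weight updates. The task, format, and examples are hypothetical placeholders.

```python
# A minimal sketch of in-context learning (ICL) prompt construction:
# demonstrations are concatenated ahead of the query, and the label slot
# for the query is left open for the model to complete.
demonstrations = [
    ("I waited forty minutes and nobody answered.", "negative"),
    ("The agent fixed my billing issue right away.", "positive"),
]

def build_icl_prompt(query: str) -> str:
    # Each demonstration becomes an input/label pair in a fixed format.
    lines = [f"Review: {text}\nSentiment: {label}" for text, label in demonstrations]
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

print(build_icl_prompt("Support was friendly but slow."))
```

Because no parameters change, ICL trades fine-tuning cost for prompt length: every demonstration consumes context-window tokens at inference time.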
Large-scale pre-trained language models have contributed significantly to natural language processin...