Pretrained language models (PLMs) have demonstrated remarkable performance on various natural language processing tasks: unidirectional PLMs (e.g., GPT) are well known for their superior text generation capabilities, while bidirectional PLMs (e.g., BERT) have been the prominent choice for natural language understanding (NLU) tasks. While both types of models have achieved promising few-shot learning performance, their potential for zero-shot learning has been underexplored. In this paper, we present a simple approach that uses both types of PLMs for fully zero-shot learning of NLU tasks without requiring any task-specific data: a unidirectional PLM generates class-conditioned texts guided by prompts, which are then used as training data for fine-tuning a bidirectional PLM.
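To make the two-stage recipe concrete, here is a minimal sketch in Python using the Hugging Face transformers library. The model choices (GPT-2, BERT), the prompt templates, and all hyperparameters are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: zero-shot NLU via PLM-generated training data.
# Stage 1: a unidirectional PLM (GPT-2 here) generates class-conditioned
# texts from label-specific prompts. Stage 2: a bidirectional PLM (BERT)
# is fine-tuned on the synthetic (text, label) pairs.
# Prompts, model names, and hyperparameters are illustrative assumptions.
import torch
from transformers import (AutoModelForCausalLM,
                          AutoModelForSequenceClassification, AutoTokenizer)

device = "cuda" if torch.cuda.is_available() else "cpu"

# ---- Stage 1: class-conditioned generation ----
gen_tok = AutoTokenizer.from_pretrained("gpt2")
gen_lm = AutoModelForCausalLM.from_pretrained("gpt2").to(device)

# Hypothetical prompt templates for a binary sentiment task.
prompts = {0: 'The movie review in negative sentiment is: "',
           1: 'The movie review in positive sentiment is: "'}

texts, labels = [], []
for label, prompt in prompts.items():
    inputs = gen_tok(prompt, return_tensors="pt").to(device)
    outputs = gen_lm.generate(**inputs, do_sample=True, top_k=40,
                              max_new_tokens=40, num_return_sequences=8,
                              pad_token_id=gen_tok.eos_token_id)
    for seq in outputs:
        # Strip the prompt, keeping only the generated continuation.
        text = gen_tok.decode(seq, skip_special_tokens=True)[len(prompt):]
        texts.append(text.strip())
        labels.append(label)

# ---- Stage 2: fine-tune a bidirectional PLM on the synthetic data ----
clf_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
clf = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2).to(device)
optim = torch.optim.AdamW(clf.parameters(), lr=2e-5)

clf.train()
for epoch in range(3):
    for text, label in zip(texts, labels):
        batch = clf_tok(text, truncation=True, return_tensors="pt").to(device)
        loss = clf(**batch, labels=torch.tensor([label], device=device)).loss
        loss.backward()
        optim.step()
        optim.zero_grad()
```

In practice a quality-filtering step over the generated texts and stronger regularization during fine-tuning would typically be layered on top of this skeleton; both are omitted here for brevity.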
Recent advances in large pre-trained language models (PLMs) have led to impressive gains on natural languag...
Traditional text classification approaches often require a good amount of labeled data, which is dif...
In recent years, the community of natural language processing (NLP) has seen amazing progress in the...
Nowadays, owing to the superior capacity of large pre-trained language models (PLMs), the PLM-bas...
Recently, there has been growing interest in dataset generation due to the superior generative capacity o...
One of the most impressive results of recent NLP history is the ability of pre-trained language mode...
The task of data-to-text generation amounts to describing structured data, such as RDF triples, in f...
Text Generation aims to produce plausible and readable text in a human language from input data. The...
The task of data-to-text generation amounts to describing structured data in fluent natural language...
Large language models have recently been shown to attain reasonable zero-shot ...
Recent works have shown promising results of prompt tuning in stimulating pre-trained language model...