Data-to-text generation systems aim to generate text descriptions from input data (often represented in tabular form). A typical system requires a large number of training samples to learn the correspondence between tables and texts. However, large training sets are expensive to obtain, limiting the applicability of these approaches in real-world scenarios. In this work, we focus on few-shot data-to-text generation. We observe that, while fine-tuned pretrained language models may generate plausible sentences, they suffer from a low semantic coverage problem in the few-shot setting; that is, important input slots tend to be missing in the generated text. To address this, we propose a search-and-learning approach that leverages pretrained lan...
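The low semantic coverage problem described above can be made concrete with a simple check: count how many input slot values actually surface in the generated sentence. The sketch below is only an illustration of that idea, not the abstract's actual method; the function name, the record format, and the verbatim string-match heuristic are all assumptions introduced here.

```python
# Illustrative sketch only: a naive slot-coverage check for data-to-text
# output, assuming a slot counts as "covered" when its value appears
# verbatim in the generated sentence. Real systems would need fuzzier
# matching (paraphrases, number formats, etc.).

def slot_coverage(slots: dict[str, str], generated: str) -> float:
    """Return the fraction of input slot values mentioned in `generated`."""
    text = generated.lower()
    covered = sum(1 for value in slots.values() if value.lower() in text)
    return covered / len(slots) if slots else 1.0

# Example: a restaurant record with three slots.
record = {"name": "The Eagle", "food": "French", "area": "riverside"}
output = "The Eagle serves French food."  # the "area" slot is missing
print(slot_coverage(record, output))      # -> 0.666...
```

A coverage score below 1.0 flags exactly the failure mode the abstract targets: fluent output that silently drops input slots.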
To learn text understanding models with millions of parameters, one needs massive amounts of data. In...
Recent neural models have shown significant progress on the problem of generating short descriptive ...
Recent advances in large pre-trained language models (PLMs) have led to impressive gains on natural languag...
In data-to-text (D2T) generation, training on in-domain data leads to overfitting to the data repres...
Pretrained language models (PLMs) have made remarkable progress in table-to-text generation tasks. H...
In recent years, deep learning has made substantial improvements in various fields like image unders...
The task of data-to-text generation amounts to describing structured data, such as RDF triples, in f...
In recent years, the natural language processing (NLP) community has seen amazing progress in the...
Pretraining deep neural networks to perform language modeling - that is, to reconstruct missing word...
Large pre-trained language models have repeatedly shown their ability to produce fluent text. Yet ev...
Based on recent advances in natural language modeling and in text generation capabilities, we ...
Providing pretrained language models with simple task descriptions in natural language enables them ...
During the last decade, machine learning techniques have been used successfully in many applications...
In many cases of machine learning, research suggests that the development of training data might hav...
In real-world applications of natural language generation, target sentences are often required to sa...