Controllable text generation has recently made great strides. Yet existing methods are either constrained to a one-off pattern or too inefficient to accept multiple conditions at every generation stage. We propose a model-agnostic framework, the Plug-in Conditional Auto-Encoder for Controllable Text Generation (PCAE), for flexible and semi-supervised text generation. Our framework is "plug-and-play": only a subset of the pre-trained model's parameters (fewer than half) needs to be fine-tuned. Crucial to the success of PCAE is the proposed broadcasting label fusion network, which steers the global latent code into a specified local, confined space (see the sketch after this list). Visualization of the local latent prior confirms the primary devotion in hid...
To learn text understanding models with millions of parameters one needs massive amounts of data. In...
Neural network-based methods are widely used in text generation. The end-to-end training of neural n...
Text Generation aims to produce plausible and readable text in a human language from input data. The...
Pretrained Transformer-based language models (LMs) display remarkable natural language generation ca...
Deep neural networks have recently achieved remarkable empirical success in text generation tasks. U...
Controllable Text Generation (CTG) is an emerging area in the field of natural language generation (NLG...
Large pre-trained language models have repeatedly shown their ability to produce fluent text. Yet ev...
Controllable text generation systems often leverage control codes to direct various properties of th...
Prefix-tuning is a powerful lightweight technique for adapting a large pre-trained language model to...
This paper studies the use of language models as a source of synthetic unlabeled text for NLP. We fo...
We propose a general and efficient framework to control auto-regressive generation models with NeurA...
Controllable text generation (CTG) aims to generate text with desired attributes, and decoding-time-...
Large language models (LMs) based on Transformers make it possible to generate plausible long texts. In this pap...
Natural language generation from structured data mainly focuses on surface-level descriptions, suffe...
Current efficient fine-tuning methods (e.g., adapters, prefix-tuning) have optimized condition...
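As a rough illustration of the broadcasting label fusion idea mentioned in the PCAE abstract above, the following PyTorch sketch fuses a label embedding with a global latent code to produce a label-conditioned "local" latent. This is only a minimal sketch under assumed dimensions: the module and parameter names (`BroadcastLabelFusion`, `latent_dim`, `num_labels`) are illustrative, and the concat-plus-MLP fusion here is a simple stand-in for the paper's actual broadcasting mechanism, which is not specified in the excerpt.

```python
import torch
import torch.nn as nn

class BroadcastLabelFusion(nn.Module):
    """Hypothetical sketch: fuse a label embedding with a global latent
    code so that each label maps the shared latent into its own local,
    confined region. Names and dimensions are assumptions, not the
    paper's implementation."""

    def __init__(self, latent_dim: int, num_labels: int):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, latent_dim)
        self.fuse = nn.Sequential(
            nn.Linear(2 * latent_dim, latent_dim),
            nn.Tanh(),
        )

    def forward(self, z_global: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # z_global: (batch, latent_dim); labels: (batch,) integer label ids
        y = self.label_emb(labels)                # (batch, latent_dim)
        fused = torch.cat([z_global, y], dim=-1)  # (batch, 2 * latent_dim)
        return self.fuse(fused)                   # label-conditioned local latent

# Usage: map a shared latent sample into a label-specific region.
fusion = BroadcastLabelFusion(latent_dim=64, num_labels=5)
z = torch.randn(8, 64)
y = torch.randint(0, 5, (8,))
z_local = fusion(z, y)                            # (8, 64)
```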