Neural language models often fail to generate diverse and informative texts, limiting their applicability to real-world problems. While previous approaches have addressed these issues by identifying and penalizing undesirable behaviors (e.g., repetition, overuse of frequent words) in language models, we propose an alternative approach based on an observation: models primarily learn attributes within examples that are likely to cause degeneration problems. Based on this observation, we propose a new approach to prevent degeneration by training two models. Specifically, we first train a model that is designed to amplify undesirable patterns. We then enhance the diversity of the second model by focusing on patterns that th...
Neural Natural Language Generation (NLG) systems are well known for their unreliability. To overcome...
Automatic generation of text is an important topic in natural language processing with applications ...
Despite considerable advances in neural language modeling, it remains an open question what the best...
Recent studies have determined that the learned token embeddings of large-scale neural language mode...
Text generation is of great importance to many natural language processing applications. However, ma...
Warning: this paper contains model outputs exhibiting offensiveness and biases. Recently pre-trained...
While large-scale neural language models, such as GPT2 and BART, have achieved impressive results on...
Neural text generation models that are conditioned on a given input (e.g., machine translation and i...
Language model fine-tuning is essential for modern natural language processing, but is computational...
It has been widely observed that exact or approximate MAP (mode-seeking) decoding from natural langu...
In recent years, the field of language modelling has witnessed exciting developments. Especially, th...