Text-editing models have recently become a prominent alternative to seq2seq models for monolingual text-generation tasks such as grammatical error correction, simplification, and style transfer. These tasks share a common trait: they exhibit a large amount of textual overlap between the source and target texts. Text-editing models take advantage of this observation and learn to generate the output by predicting edit operations applied to the source sequence. In contrast, seq2seq models generate outputs word by word from scratch, making them slow at inference time. Text-editing models provide several benefits over seq2seq models, including faster inference speed, higher sample efficiency, and better control and interpretability of the outputs.
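To make the edit-operation idea concrete, below is a minimal sketch of the reconstruction step such models rely on: given per-token edit tags, the target text is rebuilt from the source. The tag scheme here (keep/delete, with an optional appended phrase so that delete-plus-append acts as a replacement) is illustrative, loosely in the spirit of LaserTagger-style tag sets; the learned tagger that predicts these operations is omitted.

```python
def apply_edits(tokens, edits):
    """Rebuild the target text from per-token edit operations.

    edits[i] = (op, phrase): op is "KEEP" or "DELETE"; phrase, if not None,
    is inserted after handling token i, so ("DELETE", w) replaces token i
    with w. This tag scheme is an illustrative assumption, not a fixed API.
    """
    out = []
    for token, (op, phrase) in zip(tokens, edits):
        if op == "KEEP":
            out.append(token)      # copy the source token unchanged
        if phrase is not None:
            out.append(phrase)     # insert new material after this position
    return " ".join(out)

# Grammatical error correction example: only one token changes, so the
# model edits a fraction of the sequence instead of regenerating all of it.
src = "She go to school yesterday .".split()
edits = [("KEEP", None), ("DELETE", "went"), ("KEEP", None),
         ("KEEP", None), ("KEEP", None), ("KEEP", None)]
print(apply_edits(src, edits))  # -> "She went to school yesterday ."
```

Because most tags are simply KEEP for high-overlap tasks, the prediction problem is closer to sequence labeling than to full autoregressive generation, which is where the speed and sample-efficiency gains come from.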
In this paper, we study the task of improving the cohesion and coherence of long-form text generated...
Controllable generative sequence models with the capability to extract and replicate the style of sp...
Neural text generation models are typically trained by maximizing log-likelihood with the sequence c...
We present EdiTTS, an off-the-shelf speech editing methodology based on score-based generative model...
We present EdiT5 - a novel semi-autoregressive text-editing approach designed to combine the strengt...
Users interact with text, image, code, or other editors on a daily basis. However, machine learning ...
Language-aware text editing: While software developers have various power tools at their disposal tha...
We present BLESS, a comprehensive performance benchmark of the most recent state-of-the-art Large La...
Text-driven image generation methods have shown impressive results recently, allowing casual users t...
To generate good text, many kinds of decisions must be made. Many researchers have spent much time...
Writing is, by nature, a strategic, adaptive and, more importantly, iterative process. A crucial ...
Text revision refers to a family of natural language generation tasks, where the source and target s...
Automatic summarization methods are efficient but can suffer from low quality. In comparison, manual...
Large-scale neural language models have made impressive strides in natural language generation. Howe...
Text Generation aims to produce plausible and readable text in a human language from input data. The...