Neural sequence-to-sequence models are finding increasing use in editing of documents, for example in correcting a text document or repairing source code. In this paper, we argue that common seq2seq models (with a facility to copy single tokens) are not a natural fit for such tasks, as they have to explicitly copy each unchanged token. We present an extension of seq2seq models capable of copying entire spans of the input to the output in one step, greatly reducing the number of decisions required during inference. This extension means that there are now many ways of generating the same output, which we handle by deriving a new objective for training and a variation of beam search for inference that explicitly handles this problem. In our ex...
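The span-copying idea in the abstract above can be pictured with a small decoding-step sketch. The following is a minimal, purely illustrative NumPy example and not the paper's implementation: the decoder scores ordinary vocabulary tokens and every contiguous input span in a single softmax, so one "copy span (i, j)" action can reproduce many unchanged tokens at once. All array names, dimensions, and the bilinear span-scoring form are assumptions made for illustration.

```python
# Hypothetical sketch of one decoding step in a span-copying seq2seq model.
# Assumed components (not from the paper): random encoder states, a single
# decoder state, a vocabulary projection, and a bilinear span scorer.
import numpy as np

rng = np.random.default_rng(0)

src_len, hidden, vocab = 6, 8, 10
enc = rng.normal(size=(src_len, hidden))        # encoder states for input tokens
dec_state = rng.normal(size=(hidden,))          # current decoder state

W_gen = rng.normal(size=(hidden, vocab))        # projection to vocabulary logits
W_span = rng.normal(size=(2 * hidden, hidden))  # scores a span from its boundary states

# Vocabulary-generation logits: one per output token type.
gen_logits = dec_state @ W_gen

# Span-copy logits: one per contiguous input span (i, j), scored from the
# concatenated encoder states at the span's start and end positions.
span_logits, spans = [], []
for i in range(src_len):
    for j in range(i, src_len):
        boundary = np.concatenate([enc[i], enc[j]])
        span_logits.append(boundary @ W_span @ dec_state)
        spans.append((i, j))
span_logits = np.array(span_logits)

# A single softmax over both action types, so copying a whole span competes
# directly with generating one token.
all_logits = np.concatenate([gen_logits, span_logits])
probs = np.exp(all_logits - all_logits.max())
probs /= probs.sum()

best = int(probs.argmax())
if best < vocab:
    print("generate vocabulary token id", best)
else:
    print("copy input span (inclusive)", spans[best - vocab])
```

Because a given output can now be produced by many different action sequences (token-by-token generation, one long copy, or several shorter copies), training would have to marginalize over these paths and inference would have to merge beam entries that yield the same prefix, which is the problem the abstract's modified objective and beam search address.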
Neural text generation models are typically trained by maximizing log-likelihood with the sequence c...
Conditional set generation learns a mapping from an input sequence of tokens to a set. Several NLP t...
We introduce an online neural sequence to sequence model that learns to alternate between encoding a...
We address an important problem in sequence-to-sequence (Seq2Seq) learni...
Copying mechanism shows effectiveness in sequence-to-sequence based neural network models for text g...
In Natural Language Processing (NLP), it is important to detect the relationship between two sequenc...
Many natural language generation tasks, such as abstractive summarization and text simplification, a...
Most existing sequence generation models produce outputs in one pass, usually left-to-right. However...
We present Seq2SeqPy, a lightweight toolkit for sequence-to-sequence modeling t...
The task of Grammatical Error Correction (GEC) has received remarkable attention with wide applicati...
Pretrained, large, generative language models (LMs) have had great success in a wide range of sequen...
Large-scale neural language models have made impressive strides in natural language generation. Howe...
Natural language reduplication can pose a challenge to neural models of language, and has been argue...
The field of automatic program repair has adapted deep learning techniques. Sequence-to-sequence neur...