Session 6A: Machine learning
We address an important problem in sequence-to-sequence (Seq2Seq) learning referred to as copying, in which certain segments of the input sequence are selectively replicated in the output sequence. A similar phenomenon is observable in human language communication; for example, humans tend to repeat entity names or even long phrases in conversation. The challenge of copying in Seq2Seq is that new machinery is needed to decide when to perform the operation. In this paper, we incorporate copying into neural network-based Seq2Seq learning and propose a new model, called COPYNET, with an encoder-decoder structure. COPYNET can nicely integrate the regular way of word generation in the decoder with the new copyin...
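The abstract above describes blending ordinary word generation with a copy operation over the source. A minimal sketch of that mixture idea is shown below, in the style of a pointer-generator: COPYNET's exact scoring functions differ, and all names (`mix_generate_and_copy`, `p_gen`, etc.) are illustrative assumptions, not the paper's API.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def mix_generate_and_copy(gen_logits, copy_scores, source_ids, p_gen):
    """Blend a generation distribution over the vocabulary with a copy
    distribution over source positions (hypothetical sketch; COPYNET
    scores copying differently, but the mixture idea is the same).

    gen_logits:  unnormalized scores over the whole vocabulary
    copy_scores: unnormalized attention scores, one per source position
    source_ids:  vocabulary id of the token at each source position
    p_gen:       probability of generating; (1 - p_gen) is copying
    """
    p_vocab = softmax(np.asarray(gen_logits, dtype=float))
    p_copy = softmax(np.asarray(copy_scores, dtype=float))
    out = p_gen * p_vocab
    # Scatter copy mass onto the vocabulary ids of the source tokens,
    # so a token that appears in the input gets extra probability.
    for pos, tok in enumerate(source_ids):
        out[tok] += (1.0 - p_gen) * p_copy[pos]
    return out
```

With uniform generation logits, a source token that attracts most of the copy attention ends up more probable than any non-source token, which is the "decide when to copy" behavior the abstract refers to.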
2018-08-01
Recurrent neural networks (RNN) have been successfully applied to various Natural Language...
Deep (recurrent) neural networks have been shown to successfully learn complex mappings between arbit...
We develop a precise survey on sequence-to-sequence learning with neural networks and its mod...
Copying mechanism shows effectiveness in sequence-to-sequence based neural network models for text g...
Neural sequence-to-sequence models are finding increasing use in editing of documents, for example i...
31st AAAI Conference on Artificial Intelligence, AAAI 2017, San Francisco, CA, USA, 4-10 February 20...
Many natural language generation tasks, such as abstractive summarization and text simplification, a...
In Natural Language Processing (NLP), it is important to detect the relationship between two sequenc...
Many machine learning tasks can be expressed as the transformation, or transduction, of input sequen...
The field of automatic program repair has adapted deep learning techniques. Sequence-to-sequence neur...
Deep Neural Networks (DNNs) are powerful models that have achieved excellent performance on difficu...
This paper presents results from a series of simulations that attempted to teach a vanilla sequence-...
In encoder-decoder based sequence-to-sequence modeling, the most common practice is to stack a numbe...
Flexible neural sequence models outperform grammar- and automaton-based counterparts on a variety of...
Most machine learning algorithms require a fixed length input to be able to perform commonly desired...