Transformer-based models have achieved state-of-the-art results in a wide range of natural language processing (NLP) tasks, including document summarization. Typically these systems are trained by fine-tuning a large pre-trained model on the target task. One issue with these transformer-based models is that their memory and compute requirements do not scale well as the input length grows. Thus, for long document summarization, it can be challenging to train or fine-tune these models. In this work, we exploit large pre-trained transformer-based models and address long-span dependencies in abstractive summarization using two methods: local self-attention and explicit content selection. These approaches are compared on a range of ne...
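The local self-attention idea mentioned above can be sketched minimally: each position attends only to neighbours within a fixed window `w`, so cost grows linearly with sequence length rather than quadratically. This is an illustrative sketch (query/key/value projections and multiple heads are omitted; the function name and window convention are assumptions, not the abstract's actual implementation).

```python
import numpy as np

def local_self_attention(x, w):
    """Windowed (local) self-attention over an (n, d) array.

    Each position i attends only to positions in [i - w, i + w],
    reducing the O(n^2) cost of full attention to O(n * w).
    """
    n, d = x.shape
    out = np.zeros_like(x)
    for i in range(n):
        lo, hi = max(0, i - w), min(n, i + w + 1)
        scores = x[lo:hi] @ x[i] / np.sqrt(d)   # attention logits within the window
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                # softmax over the window only
        out[i] = weights @ x[lo:hi]             # weighted sum of windowed values
    return out
```

With a window wide enough to cover the whole sequence, this reduces to ordinary full self-attention; shrinking `w` trades global context for memory, which is the motivation for its use on long documents.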
Summarization is the notion of abstracting key content from information sources. The task of summari...
With the Internet becoming widespread, countless articles and multimedia content have been filled in...
Aiming at the fluency problem of extractive method, the accuracy problem of abstractive method, and ...
Long documents such as academic articles and business reports have been the standard format to detai...
Transformer models have achieved state-of-the-art results in a wide range of NLP tasks including sum...
Despite the successes of neural attention models for natural language generation tasks, the quadrati...
In this thesis, we propose a novel neural single-document extractive summarization model for long d...
In the rapidly growing medium of podcasts, as episodes are automatically transcribed the need for go...
Most text summarization research focuses on summarizing short documents, and very few wor...
The quadratic memory complexity of transformers prevents long document summarization in low computat...
Automatic text summarization is a method used to compress documents while preserving the main idea o...
Text summarization is a critical Natural Language Processing (NLP) task with applications ranging fr...
Extractive text summarization involves selecting and combining key sentences directly from the origi...
Transformer deep models have gained lots of attraction in Neural Text Summariz...