In the rapidly growing medium of podcasts, where episodes are automatically transcribed, the need has increased for natural language summarization models that can handle the obstacles presented by both the transcriptions and the format. This thesis investigates transformer-based sequence-to-sequence models, in which an attention mechanism tracks which words in the context matter most for predicting the next word in the sequence. Different summarization models are evaluated on a large-scale, open-domain podcast dataset that presents challenges such as transcription errors, multiple speakers, varied genres and structures, and long texts. The results show that a sparse attention mechanism using a sliding window h...
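The sliding-window sparse attention mentioned in this abstract restricts each token to a local neighborhood instead of the full sequence, which is what makes long transcripts tractable. A minimal NumPy sketch of the idea follows; the function name, window size, and toy shapes are illustrative assumptions, not taken from any of the works listed here.

```python
import numpy as np

def sliding_window_attention(q, k, v, window: int):
    """Scaled dot-product attention where position i may only attend
    to positions j with |i - j| <= window (a banded attention mask).

    q, k, v: arrays of shape (seq_len, d). Returns shape (seq_len, d).
    """
    seq_len, d = q.shape
    scores = q @ k.T / np.sqrt(d)                       # (seq_len, seq_len)

    # Band mask: keep only keys within the local window of each query.
    idx = np.arange(seq_len)
    mask = np.abs(idx[:, None] - idx[None, :]) <= window
    scores = np.where(mask, scores, -np.inf)

    # Row-wise softmax over the unmasked scores (masked entries -> 0).
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy usage: 8 positions, dimension 4, window of 2.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
out = sliding_window_attention(x, x, x, window=2)
print(out.shape)  # (8, 4)
```

For clarity this sketch still materializes the full seq_len x seq_len score matrix; efficient implementations compute only the banded entries, reducing the cost from quadratic to linear in sequence length.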
Compact representations of video data can enable efficient video browsing. Such representations provide...
Automatic summarization is a technique for quickly conveying key information by condensing large...
In this paper, we present an approach to Spanish talk show summarization. Our approach is base...
Podcasts have become increasingly popular in recent years, resulting in a massive amount of audio co...
Transformer-based models have achieved state-of-the-art results in a wide range of natural language ...
Transformer models have achieved state-of-the-art results in a wide range of NLP tasks including sum...
Abstractive summarization is a standard task for written documents, such as news articles. Applying ...
Neural sequence-to-sequence (seq2seq) models have been widely used in abstractive summarization task...
Attention-based encoder-decoder models have been widely used in text summarization, machine transl...
This manuscript proposes a new benchmark to assess the goodness of visual summaries without the nece...
Sequence-to-sequence models have recently achieved state-of-the-art performance in summarization. ...
As the growth of online data continues, automatic summarization is integral to generating a condens...