The purpose of extractive summarization is to automatically select a number of indicative sentences, passages, or paragraphs from the original document according to a target summarization ratio and then sequence them to form a concise summary. In this paper, we propose the use of probabilistic latent topical information for extractive summarization of spoken documents. Various kinds of modeling structures and learning approaches were extensively investigated. In addition, the summarization capabilities were verified by comparison with the conventional vector space model and the latent semantic indexing model, as well as the hidden Markov model (HMM). The experiments were performed on Chinese broadcast news collected in Taiwan. Noticeable perfor...
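The ratio-driven selection described above can be illustrated with a minimal sketch. The snippet below is not the paper's method; it implements only the conventional vector-space baseline the abstract compares against, ranking each sentence by its TF-IDF cosine similarity to the whole document and keeping the top fraction given by the summarization ratio (scikit-learn is assumed to be available; function and variable names are illustrative).

```python
# Minimal sketch of ratio-based extractive summarization with a
# vector-space (TF-IDF + cosine similarity) sentence ranker.
import math
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def extractive_summary(sentences, ratio=0.3):
    """Select about ratio * N sentences most similar to the whole document."""
    vectorizer = TfidfVectorizer()
    sent_vecs = vectorizer.fit_transform(sentences)         # one row per sentence
    doc_vec = vectorizer.transform([" ".join(sentences)])   # whole-document vector
    scores = cosine_similarity(sent_vecs, doc_vec).ravel()

    n_select = max(1, math.floor(ratio * len(sentences)))
    top = sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)[:n_select]
    return [sentences[i] for i in sorted(top)]               # keep original order


if __name__ == "__main__":
    doc = [
        "The central bank raised interest rates today.",
        "Analysts had widely expected the move.",
        "In sports, the local team won its third straight game.",
    ]
    print(extractive_summary(doc, ratio=0.34))
```

The topical models the paper proposes would replace the cosine score with a probabilistic relevance measure between each sentence and the document, but the selection-by-ratio step stays the same.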
Extractive summarization, with the intention of automatically selecting a set of representative sent...
Extractive speech summarization, aiming to automatically select an indicative set of sentences from ...
Text summarization has been one of the most challenging areas of research in NLP. Much effort has be...
This paper considers extractive summarization of Chinese spoken documents. In contrast to convention...
Huge quantities of multimedia content, including audio and video, are continuously growin...
Abstract. The purpose of extractive summarization is to automatically select indicative sentences, p...
The purpose of extractive document summarization is to automatically select a number of indicative s...
Extractive summarization usually automatically selects indicative sentences from a document accordin...
Abstract—The task of extractive speech summarization is to select a set of salient sentences from an...
Abstract—In this paper, we consider extractive summarization of broadcast news speech, and propose a...
Several approaches to automatic speech summarization are discussed below, using the ICSI Meetings co...
In a previous paper [1] two new scoring measures, Topic Significance (TS) and Topic Entropy (TE), o...
Extractive summarization is intended to automatically select a set of representative sentences from ...
Human-quality text summarization systems are difficult to design, and even more difficult to evaluat...