Title: Learning summary prior representation for extractive summarization
Authors: Cao, Z
Wei, F
Li, S
Li, W 
Zhou, M
Wang, H
Issue Date: 2015
Publisher: Association for Computational Linguistics (ACL)
Source: ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference, 26-31 July 2015, v. 2, p. 829-833
Abstract: In this paper, we propose the concept of a summary prior, which measures how appropriate a sentence is for inclusion in a summary without considering its context. Unlike previous work that relies on manually compiled document-independent features, we develop a novel summarization system called PriorSum, which applies enhanced convolutional neural networks to capture summary prior features derived from length-variable phrases. Under a regression framework, the learned prior features are concatenated with document-dependent features for sentence ranking. Experiments on the DUC generic summarization benchmarks show that PriorSum can discover different aspects supporting the summary prior and outperforms state-of-the-art baselines.
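The scoring pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding size, filter counts, window widths, tanh nonlinearity, and random weights are all assumptions standing in for trained parameters; only the overall shape (convolutions over length-variable phrase windows, max-pooling into a fixed-size prior feature vector, concatenation with document-dependent features, linear regression scoring) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 50         # word embedding size (assumed)
NUM_FILTERS = 20     # convolution filters per window width (assumed)
WINDOWS = (1, 2, 3)  # phrase (n-gram) widths, i.e. "length-variable phrases"

# Randomly initialised filters stand in for trained parameters.
filters = {w: rng.standard_normal((NUM_FILTERS, w * EMB_DIM)) * 0.1
           for w in WINDOWS}

def prior_features(sentence_emb):
    """Convolve each window width over the sentence's word embeddings
    and max-pool over positions, giving a fixed-size prior feature vector."""
    feats = []
    for w, F in filters.items():
        convs = [F @ sentence_emb[i:i + w].reshape(-1)
                 for i in range(len(sentence_emb) - w + 1)]
        feats.append(np.tanh(np.stack(convs)).max(axis=0))
    return np.concatenate(feats)

def score(sentence_emb, doc_features, weights):
    """Regression scorer: concatenate the learned prior features with
    document-dependent features, then take a linear combination."""
    x = np.concatenate([prior_features(sentence_emb), doc_features])
    return float(weights @ x)

# Toy usage: a 6-word "sentence" and 3 document-dependent features
# (e.g. sentence position, term frequency, cluster frequency).
sent = rng.standard_normal((6, EMB_DIM))
doc_feats = np.array([0.5, 0.1, 0.2])
w = rng.standard_normal(NUM_FILTERS * len(WINDOWS) + len(doc_feats))
print(score(sent, doc_feats, w))
```

In an extractive setting, sentences would be ranked by this score and the top-ranked ones selected into the summary subject to a length budget.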
ISBN: 9781941643730
Appears in Collections: Conference Paper




Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.