Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/55395
Title: Learning summary prior representation for extractive summarization
Authors: Cao, Z
Wei, F
Li, S
Li, W 
Zhou, M
Wang, H
Issue Date: 2015
Publisher: Association for Computational Linguistics (ACL)
Source: ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference, 26-31 July 2015, v. 2, p. 829-833
Abstract: In this paper, we propose the concept of summary prior to define how appropriate a sentence is for selection into a summary, without consideration of its context. Different from previous work using manually compiled document-independent features, we develop a novel summary system called PriorSum, which applies enhanced convolutional neural networks to capture the summary prior features derived from length-variable phrases. Under a regression framework, the learned prior features are concatenated with document-dependent features for sentence ranking. Experiments on the DUC generic summarization benchmarks show that PriorSum can discover different aspects supporting the summary prior and outperform state-of-the-art baselines.
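Note: The following is a minimal illustrative sketch of the idea summarized in the abstract, not the authors' implementation. It assumes that convolutions with several filter widths stand in for the "length-variable phrase" prior features, and that a linear regression layer scores each sentence from the concatenation of those prior features with hand-crafted document-dependent features. The class name PriorSumSketch and all layer sizes and feature dimensions are assumptions made for illustration.

# Illustrative sketch only (assumed architecture, not the published PriorSum code).
import torch
import torch.nn as nn

class PriorSumSketch(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=50, n_filters=20,
                 filter_widths=(1, 2, 3), doc_feat_dim=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One convolution per phrase length: a filter of width w reads a
        # window of w consecutive words, i.e. a length-w phrase.
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, n_filters, kernel_size=w) for w in filter_widths
        )
        # Regression head over [prior features ; document-dependent features].
        self.score = nn.Linear(n_filters * len(filter_widths) + doc_feat_dim, 1)

    def forward(self, word_ids, doc_features):
        # word_ids: (batch, sent_len) token ids of one sentence per row
        # doc_features: (batch, doc_feat_dim), e.g. position or TF-IDF sums
        x = self.embed(word_ids).transpose(1, 2)          # (batch, emb_dim, len)
        pooled = [torch.relu(conv(x)).max(dim=2).values   # max over phrase positions
                  for conv in self.convs]
        prior = torch.cat(pooled, dim=1)                  # document-independent prior
        return self.score(torch.cat([prior, doc_features], dim=1)).squeeze(1)

# Usage: score a batch of two 12-word sentences, each with 4 document-dependent features.
model = PriorSumSketch()
scores = model(torch.randint(0, 10000, (2, 12)), torch.rand(2, 4))
print(scores.shape)  # torch.Size([2])

In the paper itself the prior features are learned with an enhanced CNN and combined with document-dependent features under a regression framework; the sketch above only mirrors that overall structure at the smallest possible scale.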
URI: http://hdl.handle.net/10397/55395
ISBN: 9781941643730
Appears in Collections:Conference Paper


SCOPUS citations: 16 (as of May 31, 2018)
Page view(s): 72 (as of Jun 18, 2018)
