Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/105724
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Computing | en_US |
dc.creator | Cao, Z | en_US |
dc.creator | Li, W | en_US |
dc.creator | Li, S | en_US |
dc.creator | Wei, F | en_US |
dc.creator | Li, Y | en_US |
dc.date.accessioned | 2024-04-15T07:36:14Z | - |
dc.date.available | 2024-04-15T07:36:14Z | - |
dc.identifier.isbn | 978-4-87974-702-0 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/105724 | - |
dc.description | 26th International Conference on Computational Linguistics, December 11-16, 2016, Osaka, Japan | en_US |
dc.language.iso | en | en_US |
dc.publisher | Association for Computational Linguistics (ACL) | en_US |
dc.rights | Copyright of each paper stays with the respective authors (or their employers). | en_US |
dc.rights | Posted with permission of the author. | en_US |
dc.rights | The following publication Ziqiang Cao, Wenjie Li, Sujian Li, Furu Wei, and Yanran Li. 2016. AttSum: Joint Learning of Focusing and Summarization with Neural Attention. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 547–556, Osaka, Japan. The COLING 2016 Organizing Committee is available at https://aclanthology.org/C16-1053/. | en_US |
dc.title | AttSum : joint learning of focusing and summarization with neural attention | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 547 | en_US |
dc.identifier.epage | 556 | en_US |
dcterms.abstract | Query relevance ranking and sentence saliency ranking are the two main tasks in extractive query-focused summarization. Previous supervised summarization systems often perform the two tasks in isolation. However, since reference summaries reflect a trade-off between relevance and saliency, neither ranker can be trained well when the summaries are used as direct supervision. This paper proposes a novel summarization system called AttSum, which tackles the two tasks jointly. It automatically learns distributed representations for sentences as well as for the document cluster, and applies the attention mechanism to simulate human attentive reading when a query is given. Extensive experiments are conducted on the DUC query-focused summarization benchmark datasets. Without using any hand-crafted features, AttSum achieves competitive performance. We also observe that the sentences recognized as focusing on the query indeed meet the query's needs. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | In The 26th International Conference on Computational Linguistics: Proceedings of COLING 2016: Technical Papers, p. 547-556 | en_US |
dcterms.issued | 2016 | - |
dc.relation.ispartofbook | The 26th International Conference on Computational Linguistics: Proceedings of COLING 2016: Technical Papers | en_US |
dc.relation.conference | International Conference on Computational Linguistics [COLING] | en_US |
dc.description.validate | 202402 bcch | en_US |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | COMP-1599 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | National Natural Science Foundation of China; The Hong Kong Polytechnic University | en_US |
dc.description.pubStatus | Published | en_US |
dc.identifier.OPUS | 19996042 | - |
dc.description.oaCategory | Copyright retained by author | en_US |
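The abstract above describes scoring sentences by query relevance with an attention distribution over a document cluster. The following is a minimal sketch of that attention-scoring idea only; the toy embeddings and helper names are hypothetical, and AttSum itself learns sentence and cluster representations with neural encoders rather than using fixed vectors.

```python
import math

# Toy sentence and query embeddings (hypothetical stand-ins; in AttSum these
# representations are learned jointly by the model, not hand-specified).
sentences = {
    "s1": [0.9, 0.1, 0.0],
    "s2": [0.2, 0.8, 0.1],
    "s3": [0.1, 0.2, 0.9],
}
query = [1.0, 0.0, 0.1]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Softmax attention over sentences, keyed by similarity to the query:
# higher dot product with the query -> larger attention weight.
raw = {name: dot(vec, query) for name, vec in sentences.items()}
peak = max(raw.values())
exp = {name: math.exp(s - peak) for name, s in raw.items()}
total = sum(exp.values())
attention = {name: e / total for name, e in exp.items()}

# Rank sentences by attention weight: most query-relevant first.
ranked = sorted(attention, key=attention.get, reverse=True)
print(ranked[0])  # the sentence judged most query-relevant
```

Here the attention weights form a probability distribution over sentences, so the same scores can both rank sentences for extraction and weight them when pooling a cluster representation, which is the joint use the abstract describes.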
Appears in Collections: | Conference Paper |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
C16-1053.pdf | | 224.86 kB | Adobe PDF | View/Open |
Page views: 15 (as of May 19, 2024)
Downloads: 2 (as of May 19, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.