Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/105492
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Computing | - |
| dc.creator | Yuan, R | - |
| dc.creator | Wang, Z | - |
| dc.creator | Li, W | - |
| dc.date.accessioned | 2024-04-15T07:34:41Z | - |
| dc.date.available | 2024-04-15T07:34:41Z | - |
| dc.identifier.isbn | 978-1-952148-27-9 | - |
| dc.identifier.uri | http://hdl.handle.net/10397/105492 | - |
| dc.description | 28th International Conference on Computational Linguistics, December 8-13, 2020, Barcelona, Spain (Online) | en_US |
| dc.language.iso | en | en_US |
| dc.publisher | Association for Computational Linguistics (ACL) | en_US |
| dc.rights | This work is licensed under a Creative Commons Attribution 4.0 International License. License details: http://creativecommons.org/licenses/by/4.0/. | en_US |
| dc.rights | The following publication Ruifeng Yuan, Zili Wang, and Wenjie Li. 2020. Fact-level Extractive Summarization with Hierarchical Graph Mask on BERT. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5629–5639, Barcelona, Spain (Online). International Committee on Computational Linguistics is available at https://doi.org/10.18653/v1/2020.coling-main.493. | en_US |
| dc.title | Fact-level extractive summarization with hierarchical graph mask on BERT | en_US |
| dc.type | Conference Paper | en_US |
| dc.identifier.spage | 5629 | - |
| dc.identifier.epage | 5639 | - |
| dc.identifier.doi | 10.18653/v1/2020.coling-main.493 | - |
| dcterms.abstract | Most current extractive summarization models generate summaries by selecting salient sentences. However, one problem with sentence-level extractive summarization is that there exists a gap between the human-written gold summary and the oracle sentence labels. In this paper, we propose to extract fact-level semantic units for better extractive summarization. We also introduce a hierarchical structure, which incorporates multiple levels of textual granularity into the model. In addition, we integrate our model with BERT using a hierarchical graph mask. This allows us to combine BERT’s natural language understanding ability with the structural information, without increasing the scale of the model. Experiments on the CNN/DailyMail dataset show that our model achieves state-of-the-art results. | - |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | In Proceedings of the 28th International Conference on Computational Linguistics, pp. 5629-5639. Barcelona, Spain: International Committee on Computational Linguistics, 2020 | - |
| dcterms.issued | 2020 | - |
| dc.relation.ispartofbook | Proceedings of the 28th International Conference on Computational Linguistics | - |
| dc.relation.conference | International Conference on Computational Linguistics [COLING] | - |
| dc.description.validate | 202402 bcch | - |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | COMP-0157 | en_US |
| dc.description.fundingSource | RGC | en_US |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | National Natural Science Foundation of China | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.identifier.OPUS | 49977398 | en_US |
| dc.description.oaCategory | CC | en_US |
Appears in Collections: Conference Paper
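The abstract above (dcterms.abstract) hinges on one architectural idea: structural information is injected into BERT by masking self-attention along the edges of a hierarchical graph over multiple granularities, so the model itself is not enlarged. The following is a minimal, hypothetical sketch of that idea, not the authors' released code: the helper `build_hierarchical_graph_mask`, the choice of token/fact/sentence edges, and the toy node positions are all illustrative assumptions. It relies only on the fact that HuggingFace `transformers`' `BertModel` accepts a 3-D attention mask of shape (batch, from_seq, to_seq).

```python
import torch
from transformers import BertModel, BertTokenizerFast

def build_hierarchical_graph_mask(seq_len, fact_spans, fact_node_pos,
                                  sent_of_fact, sent_node_pos):
    """Build a {0,1} mask M where M[i, j] = 1 lets position i attend to j.

    fact_spans    -- (start, end) token span of each fact
    fact_node_pos -- position of each fact's graph node in the sequence
    sent_of_fact  -- index of the sentence each fact belongs to
    sent_node_pos -- position of each sentence's graph node
    """
    mask = torch.zeros(seq_len, seq_len, dtype=torch.long)
    idx = torch.arange(seq_len)
    mask[idx, idx] = 1  # every position may attend to itself
    for f, (start, end) in enumerate(fact_spans):
        fpos = fact_node_pos[f]
        spos = sent_node_pos[sent_of_fact[f]]
        mask[start:end, start:end] = 1   # tokens within a fact see each other
        mask[start:end, fpos] = 1        # tokens see their fact node
        mask[fpos, start:end] = 1        # fact node sees its tokens
        mask[fpos, spos] = 1             # fact node sees its sentence node
        mask[spos, fpos] = 1             # sentence node sees its fact nodes
    for s in sent_node_pos:              # sentence nodes see each other
        for t in sent_node_pos:
            mask[s, t] = 1
    return mask

model = BertModel.from_pretrained("bert-base-uncased")
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Toy input: one sentence containing two facts; node positions are
# illustrative only (here the "." tokens stand in for fact nodes and
# [CLS] stands in for the sentence node).
enc = tokenizer("the cat sat . the dog ran .", return_tensors="pt")
seq_len = enc["input_ids"].size(1)
graph_mask = build_hierarchical_graph_mask(
    seq_len,
    fact_spans=[(1, 4), (5, 8)],
    fact_node_pos=[4, 8],
    sent_of_fact=[0, 0],
    sent_node_pos=[0],
)
# BertModel accepts a 3-D attention mask (batch, from_seq, to_seq),
# so the graph can be imposed without changing the architecture.
out = model(input_ids=enc["input_ids"],
            attention_mask=graph_mask.unsqueeze(0))
fact_vectors = out.last_hidden_state[0, [4, 8]]  # fact-node representations
```

Because the graph is expressed purely as an attention mask over a pretrained encoder, the pretrained weights are reused unchanged, which is consistent with the abstract's claim of combining BERT's language understanding with structural information without increasing the scale of the model.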
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 2020.coling-main.493.pdf | - | 304.66 kB | Adobe PDF |