Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/105450
Title: Highlight-Transformer: Leveraging Key Phrase Aware Attention to Improve Abstractive Multi-Document Summarization
Authors: Liu, S; Cao, J; Yang, R; Wen, Z
Issue Date: 2021
Source: In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, p. 5021-5027. Stroudsburg, PA, USA: Association for Computational Linguistics (ACL), 2021
Abstract: Abstractive multi-document summarization aims to generate a comprehensive summary covering the salient content of multiple input documents. Compared with previous RNN-based models, Transformer-based models employ the self-attention mechanism to capture dependencies in the input documents and can generate better summaries. Existing works, however, have not considered key phrases when determining self-attention weights. Consequently, some tokens within key phrases receive only small attention weights, which can prevent the key phrases that convey the salient ideas of the input documents from being fully encoded. In this paper, we introduce Highlight-Transformer, a model with a highlighting mechanism in the encoder that assigns greater attention weights to tokens within key phrases. We propose two highlighting attention structures: per-head highlighting attention and multi-head highlighting attention. The experimental results on the Multi-News dataset show that our proposed model significantly outperforms the competitive baseline models.
Publisher: Association for Computational Linguistics (ACL)
ISBN: 978-1-954085-54-1
DOI: 10.18653/v1/2021.findings-acl.445
Description: Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Online, August 1-6, 2021
Rights: ©2021 Association for Computational Linguistics. This publication is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). The following publication Shuaiqi Liu, Jiannong Cao, Ruosong Yang, and Zhiyuan Wen. 2021. Highlight-Transformer: Leveraging Key Phrase Aware Attention to Improve Abstractive Multi-Document Summarization. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, pages 5021–5027, Online. Association for Computational Linguistics, is available at https://doi.org/10.18653/v1/2021.findings-acl.445.
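The highlighting mechanism described in the abstract can be pictured as an additive bias applied to the self-attention logits at key-phrase token positions before the softmax. The following is a minimal sketch, not the authors' code: it assumes a binary key-phrase mask and a single scalar bias, whereas the paper's per-head and multi-head highlighting formulations may differ; names such as `key_phrase_mask` and `highlight_bias` are illustrative.

```python
import torch
import torch.nn.functional as F


def highlight_attention(q, k, v, key_phrase_mask, highlight_bias=1.0):
    """Scaled dot-product attention with extra weight on key-phrase tokens.

    q, k, v:          (batch, heads, seq_len, head_dim)
    key_phrase_mask:  (batch, seq_len), 1.0 at tokens inside key phrases, else 0.0
    highlight_bias:   scalar added to logits of key positions inside key phrases
    """
    d = q.size(-1)
    # Standard scaled dot-product attention scores: (batch, heads, seq, seq)
    scores = q @ k.transpose(-2, -1) / d ** 0.5
    # Broadcast the key-phrase mask over heads and query positions, raising the
    # logits of key positions that fall inside key phrases so they receive
    # larger attention weights after the softmax.
    scores = scores + highlight_bias * key_phrase_mask[:, None, None, :]
    attn = F.softmax(scores, dim=-1)
    return attn @ v
```

In a multi-head variant, `highlight_bias` could be a learnable per-head parameter rather than a shared scalar, which is one plausible reading of the per-head versus multi-head distinction drawn in the abstract.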
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
---|---|---|---
2021.findings-acl.445.pdf | | 289.29 kB | Adobe PDF
Page views: 46 (as of May 11, 2025)
Downloads: 22 (as of May 11, 2025)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.