Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/105690
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Computing | - |
dc.creator | Ma, S | - |
dc.creator | Sun, X | - |
dc.creator | Xu, J | - |
dc.creator | Wang, H | - |
dc.creator | Li, W | - |
dc.creator | Su, Q | - |
dc.date.accessioned | 2024-04-15T07:35:55Z | - |
dc.date.available | 2024-04-15T07:35:55Z | - |
dc.identifier.isbn | 978-1-945626-75-3 (Volume 1) | - |
dc.identifier.isbn | 978-1-945626-76-0 (Volume 2) | - |
dc.identifier.uri | http://hdl.handle.net/10397/105690 | - |
dc.description | 55th Annual Meeting of the Association for Computational Linguistics, July 30-August 4, 2017, Vancouver, Canada | en_US |
dc.language.iso | en | en_US |
dc.publisher | Association for Computational Linguistics (ACL) | en_US |
dc.rights | © 2017 Association for Computational Linguistics | en_US |
dc.rights | This publication is licensed under a Creative Commons Attribution 4.0 International License. (https://creativecommons.org/licenses/by/4.0/) | en_US |
dc.rights | The following publication Shuming Ma, Xu Sun, Jingjing Xu, Houfeng Wang, Wenjie Li, and Qi Su. 2017. Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 635–640, Vancouver, Canada. Association for Computational Linguistics is available at https://doi.org/10.18653/v1/P17-2100. | en_US |
dc.title | Improving semantic relevance for sequence-to-sequence learning of Chinese social media text summarization | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 635 | - |
dc.identifier.epage | 640 | - |
dc.identifier.volume | 2 | - |
dc.identifier.doi | 10.18653/v1/P17-2100 | - |
dcterms.abstract | Current Chinese social media text summarization models are based on an encoder-decoder framework. Although the generated summaries are literally similar to the source texts, they have low semantic relevance. In this work, our goal is to improve the semantic relevance between source texts and summaries for Chinese social media summarization. We introduce a Semantic Relevance Based neural model that encourages high semantic similarity between texts and summaries. In our model, the source text is represented by a gated attention encoder, while the summary representation is produced by a decoder. In addition, the similarity score between the two representations is maximized during training. Our experiments show that the proposed model outperforms baseline systems on a social media corpus. | - |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | In The 55th Annual Meeting of the Association for Computational Linguistics: Proceedings of the Conference, Vol. 2 (Short Papers), p. 635-640. Stroudsburg, PA, USA: Association for Computational Linguistics (ACL), 2017 | - |
dcterms.issued | 2017 | - |
dc.identifier.scopus | 2-s2.0-85040635086 | - |
dc.relation.ispartofbook | The 55th Annual Meeting of the Association for Computational Linguistics: Proceedings of the Conference, Vol. 2 (Short Papers) | - |
dc.relation.conference | Annual Meeting of the Association for Computational Linguistics [ACL] | - |
dc.description.validate | 202402 bcch | - |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | COMP-1354 | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | NSFC | en_US |
dc.description.pubStatus | Published | en_US |
dc.identifier.OPUS | 14317940 | en_US |
dc.description.oaCategory | CC | en_US |
Appears in Collections: | Conference Paper |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
P17-2100.pdf | | 483.28 kB | Adobe PDF | View/Open |
Page views: 12 (as of May 12, 2024)

Downloads: 2 (as of May 12, 2024)

Scopus™ citations: 42 (as of May 17, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.