Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105690
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Ma, S | -
dc.creator | Sun, X | -
dc.creator | Xu, J | -
dc.creator | Wang, H | -
dc.creator | Li, W | -
dc.creator | Su, Q | -
dc.date.accessioned | 2024-04-15T07:35:55Z | -
dc.date.available | 2024-04-15T07:35:55Z | -
dc.identifier.isbn | 978-1-945626-75-3 (Volume 1) | -
dc.identifier.isbn | 978-1-945626-76-0 (Volume 2) | -
dc.identifier.uri | http://hdl.handle.net/10397/105690 | -
dc.description | 55th Annual Meeting of the Association for Computational Linguistics, July 30-August 4, 2017, Vancouver, Canada | en_US
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics (ACL) | en_US
dc.rights | © 2017 Association for Computational Linguistics | en_US
dc.rights | This publication is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Shuming Ma, Xu Sun, Jingjing Xu, Houfeng Wang, Wenjie Li, and Qi Su. 2017. Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 635–640, Vancouver, Canada. Association for Computational Linguistics is available at https://doi.org/10.18653/v1/P17-2100. | en_US
dc.title | Improving semantic relevance for sequence-to-sequence learning of Chinese social media text summarization | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 635 | -
dc.identifier.epage | 640 | -
dc.identifier.volume | 2 | -
dc.identifier.doi | 10.18653/v1/P17-2100 | -
dcterms.abstract | Current Chinese social media text summarization models are based on an encoder-decoder framework. Although the generated summaries are literally similar to the source texts, they often have low semantic relevance. In this work, our goal is to improve the semantic relevance between source texts and summaries for Chinese social media summarization. We introduce a semantic-relevance-based neural model that encourages high semantic similarity between a text and its summary. In our model, the source text is represented by a gated attention encoder, while the summary representation is produced by a decoder; the similarity score between the two representations is maximized during training. Our experiments show that the proposed model outperforms baseline systems on a social media corpus. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In The 55th Annual Meeting of the Association for Computational Linguistics: Proceedings of the Conference, Vol. 2 (Short Papers), p. 635-640. Stroudsburg, PA, USA: Association for Computational Linguistics (ACL), 2017 | -
dcterms.issued | 2017 | -
dc.identifier.scopus | 2-s2.0-85040635086 | -
dc.relation.ispartofbook | The 55th Annual Meeting of the Association for Computational Linguistics: Proceedings of the Conference, Vol. 2 (Short Papers) | -
dc.relation.conference | Annual Meeting of the Association for Computational Linguistics [ACL] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | COMP-1354 | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | NSFC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 14317940 | en_US
dc.description.oaCategory | CC | en_US
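The abstract describes a training objective in which the similarity score between the encoder's source representation and the decoder's summary representation is maximized alongside the usual generation loss. A minimal sketch of such a combined objective, assuming cosine similarity and an illustrative weight `lam`; the function names, vectors, and weight below are hypothetical, not taken from the paper:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two dense representation vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def semantic_relevance_loss(nll, source_repr, summary_repr, lam=0.5):
    """Combined objective: negative log-likelihood of the summary minus
    a weighted similarity term, so minimizing the loss both improves
    generation and pushes the summary representation toward the
    source representation (lam is an illustrative weight)."""
    return nll - lam * cosine_similarity(source_repr, summary_repr)

# Identical representations give similarity 1.0, so the generation
# loss is reduced by the full weight lam.
loss = semantic_relevance_loss(2.0, [1.0, 0.0], [1.0, 0.0], lam=0.5)
```

In this sketch the similarity term acts as a regularizer on the decoder's hidden state: a summary that drifts semantically from the source incurs a higher loss even if its token-level likelihood is good.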
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
P17-2100.pdf | - | 483.28 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 12 (as of May 12, 2024)
Downloads: 2 (as of May 12, 2024)
Scopus™ citations: 42 (as of May 17, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.