Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105511
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Wang, L | -
dc.creator | Li, J | -
dc.creator | Zeng, X | -
dc.creator | Zhang, H | -
dc.creator | Wong, KF | -
dc.date.accessioned | 2024-04-15T07:34:47Z | -
dc.date.available | 2024-04-15T07:34:47Z | -
dc.identifier.isbn | 978-1-952148-60-6 | -
dc.identifier.uri | http://hdl.handle.net/10397/105511 | -
dc.description | 2020 Conference on Empirical Methods in Natural Language Processing, 16th-20th November 2020, Online | en_US
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics (ACL) | en_US
dc.rights | © 2020 Association for Computational Linguistics | en_US
dc.rights | This publication is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication, Lingzhi Wang, Jing Li, Xingshan Zeng, Haisong Zhang, and Kam-Fai Wong. 2020. Continuity of Topic, Interaction, and Query: Learning to Quote in Online Conversations. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 6640–6650, Online. Association for Computational Linguistics, is available at https://doi.org/10.18653/v1/2020.emnlp-main.538. | en_US
dc.title | Continuity of topic, interaction, and query: learning to quote in online conversations | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 6640 | -
dc.identifier.epage | 6650 | -
dc.identifier.doi | 10.18653/v1/2020.emnlp-main.538 | -
dcterms.abstract | Quotations are crucial for successful explanation and persuasion in interpersonal communication. However, finding what to quote in a conversation is challenging for both humans and machines. This work studies automatic quotation generation in online conversations and explores how language consistency affects whether a quotation fits the given context. Here, we capture the contextual consistency of a quotation in terms of latent topics, interactions with the dialogue history, and coherence to the query turn's existing contents. Further, an encoder-decoder neural framework is employed to continue the context with a quotation via language generation (see the illustrative sketch after this record). Experimental results on two large-scale datasets in English and Chinese demonstrate that our quotation generation model outperforms state-of-the-art models. Further analysis shows that topic, interaction, and query consistency are all helpful for learning how to quote in online conversations. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, p. 6640-6650. Stroudsburg, PA, USA: Association for Computational Linguistics (ACL), 2020 | -
dcterms.issued | 2020 | -
dc.relation.ispartofbook | Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing | -
dc.relation.conference | Conference on Empirical Methods in Natural Language Processing [EMNLP] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | COMP-0195 | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | NSFC; Startup Fund | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 50290579 | en_US
dc.description.oaCategory | CC | en_US
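The dcterms.abstract field above mentions an encoder-decoder neural framework that continues the conversation context with a quotation. As a purely illustrative aid, the sketch below shows a minimal, generic seq2seq quotation generator in PyTorch. It is not the authors' model: it omits the topic, interaction, and query consistency components that the paper contributes, and every class name, dimension, and the toy data here are assumptions made only for illustration.

```python
# Hypothetical sketch of a generic "encode context, decode quotation" model.
# Not the model from Wang et al. (2020); names and sizes are illustrative.
import torch
import torch.nn as nn


class QuoteSeq2Seq(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 128, hid_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder reads the conversation history plus the query turn.
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Decoder continues the context by generating quotation tokens.
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, context_ids, quote_ids):
        # context_ids: (batch, ctx_len) token ids of history + query turn
        # quote_ids:   (batch, qt_len)  token ids of the target quotation
        _, h = self.encoder(self.embed(context_ids))         # final hidden state
        dec_out, _ = self.decoder(self.embed(quote_ids), h)  # teacher forcing
        return self.out(dec_out)                             # (batch, qt_len, vocab)


if __name__ == "__main__":
    model = QuoteSeq2Seq(vocab_size=1000)
    ctx = torch.randint(0, 1000, (2, 20))  # toy batch: 2 contexts, 20 tokens each
    qt = torch.randint(0, 1000, (2, 8))    # toy target quotations, 8 tokens each
    print(model(ctx, qt).shape)            # torch.Size([2, 8, 1000])
```

In the paper's setting, the consistency signals named in the abstract (latent topics, interaction with the dialogue history, and the query turn's content) would additionally condition the decoder; the sketch above covers only a plain generation backbone.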
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
2020.emnlp-main.538.pdf | | 741.89 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.