Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105530
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Zeng, X | -
dc.creator | Li, J | -
dc.creator | Wang, L | -
dc.creator | Mao, Z | -
dc.creator | Wong, KF | -
dc.date.accessioned | 2024-04-15T07:34:52Z | -
dc.date.available | 2024-04-15T07:34:52Z | -
dc.identifier.isbn | 978-1-952148-25-5 | -
dc.identifier.uri | http://hdl.handle.net/10397/105530 | -
dc.description | 58th Annual Meeting of the Association for Computational Linguistics, Online, July 5th-10th, 2020 | en_US
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics (ACL) | en_US
dc.rights | © 2020 Association for Computational Linguistics | en_US
dc.rights | This publication is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication, Xingshan Zeng, Jing Li, Lu Wang, Zhiming Mao, and Kam-Fai Wong. 2020. Dynamic Online Conversation Recommendation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 3331–3341, Online. Association for Computational Linguistics, is available at https://doi.org/10.18653/v1/2020.acl-main.305. | en_US
dc.title | Dynamic online conversation recommendation | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 3331 | -
dc.identifier.epage | 3341 | -
dc.identifier.doi | 10.18653/v1/2020.acl-main.305 | -
dcterms.abstract | Trending topics in social media content evolve over time, and it is therefore crucial to understand social media users and their interpersonal communications in a dynamic manner. Here we study dynamic online conversation recommendation, to help users engage in conversations that satisfy their evolving interests. While most prior work assumes static user interests, our model is able to capture the temporal aspects of user interests, and further handle future conversations that are unseen during training time. Concretely, we propose a neural architecture to exploit changes of user interactions and interests over time, to predict which discussions they are likely to enter. We conduct experiments on large-scale collections of Reddit conversations, and results on three subreddits show that our model significantly outperforms state-of-the-art models that make a static assumption of user interests. We further evaluate on handling “cold start”, and observe consistently better performance by our model when considering various degrees of sparsity of user’s chatting history and conversation contexts. Lastly, analyses on our model outputs indicate user interest change, explaining the advantage and efficacy of our approach. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, p. 3331-3341. Stroudsburg, PA, USA: Association for Computational Linguistics (ACL), 2020 | -
dcterms.issued | 2020 | -
dc.relation.ispartofbook | Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics | -
dc.relation.conference | Annual Meeting of the Association for Computational Linguistics [ACL] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | COMP-0295 | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Startup Fund | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 25765052 | en_US
dc.description.oaCategory | CC | en_US
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
2020.acl-main.305.pdf | - | 565.98 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 92 (as of May 11, 2025)
Downloads: 23 (as of May 11, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.