Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107876
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Xu, C | en_US
dc.creator | Li, J | en_US
dc.creator | Li, P | en_US
dc.creator | Yang, M | en_US
dc.date.accessioned | 2024-07-15T07:55:28Z | -
dc.date.available | 2024-07-15T07:55:28Z | -
dc.identifier.isbn | 978-1-959429-62-3 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/107876 | -
dc.description | The 61st Annual Meeting of the Association for Computational Linguistics, Toronto, Canada, July 9-14, 2023 | en_US
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics (ACL) | en_US
dc.rights | © 2023 Association for Computational Linguistics | en_US
dc.rights | Materials published in or after 2016 are licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Chunpu Xu, Jing Li, Piji Li, and Min Yang. 2023. Topic-Guided Self-Introduction Generation for Social Media Users. In Findings of the Association for Computational Linguistics: ACL 2023, pages 11387–11402, Toronto, Canada. Association for Computational Linguistics is available at https://aclanthology.org/2023.findings-acl.722/. | en_US
dc.title | Topic-guided self-introduction generation for social media users | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 11387 | en_US
dc.identifier.epage | 11402 | en_US
dcterms.abstract | Millions of users are active on social media. To allow users to better showcase themselves and network with others, we explore the automatic generation of social media self-introductions: short sentences outlining a user's personal interests. While most prior work profiles users with tags (e.g., age), we investigate sentence-level self-introductions to provide a more natural and engaging way for users to get to know each other. Here we exploit a user's tweeting history to generate their self-introduction. The task is non-trivial because the history content may be lengthy, noisy, and exhibit varied personal interests. To address this challenge, we propose a novel unified topic-guided encoder-decoder (UTGED) framework; it models latent topics to reflect salient user interests, whose topic mixture then guides the encoding of a user's history while topic words control the decoding of their self-introduction. For experiments, we collect a large-scale Twitter dataset, and extensive results show the superiority of UTGED over advanced encoder-decoder models without topic modeling. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Findings of the Association for Computational Linguistics: ACL 2023, p. 11387–11402, Toronto, Canada. Association for Computational Linguistics | en_US
dcterms.issued | 2023 | -
dc.relation.conference | Annual Meeting of the Association for Computational Linguistics [ACL] | en_US
dc.description.validate | 202407 bcwh | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | a3033 | -
dc.identifier.SubFormID | 49244 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
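The abstract above describes UTGED's two-part topic guidance: a latent topic mixture steers encoding of the tweet history, and topic words steer decoding of the self-introduction. The following is a minimal toy sketch of that data flow only, not the paper's neural model; the keyword-set "topics", all function names, and the example data are invented for illustration.

```python
from collections import Counter

def topic_mixture(tweets, topics):
    """Score each latent 'topic' (here just a keyword set, standing in for
    a learned topic) against the user's tweet history, normalized to a mixture."""
    words = Counter(w.lower() for t in tweets for w in t.split())
    scores = {name: sum(words[w] for w in kws) for name, kws in topics.items()}
    total = sum(scores.values()) or 1
    return {name: s / total for name, s in scores.items()}

def guided_intro(tweets, topics, top_k=2):
    """Surface the top-k topics by mixture weight, mimicking how topic
    words would control decoding of the self-introduction."""
    mix = topic_mixture(tweets, topics)
    top = sorted(mix, key=mix.get, reverse=True)[:top_k]
    return "Interested in " + " and ".join(top) + "."

tweets = ["new pytorch model training run", "gpu kernels are fun",
          "tried a new ramen place", "training loss finally converged"]
topics = {"machine learning": {"model", "training", "loss", "pytorch"},
          "hardware": {"gpu", "kernels"},
          "food": {"ramen", "restaurant"}}
print(guided_intro(tweets, topics))  # → Interested in machine learning and hardware.
```

In the actual UTGED framework the mixture and topic words come from a neural topic model and condition an encoder-decoder; this sketch only illustrates why a topic mixture helps select salient interests from a long, noisy history.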
Appears in Collections:Conference Paper
Files in This Item:
File | Description | Size | Format
2023.findings-acl.722.pdf | | 793.69 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.