Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105509
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Wu, X | -
dc.creator | Cai, Y | -
dc.creator | Kai, Y | -
dc.creator | Wang, T | -
dc.creator | Li, Q | -
dc.date.accessioned | 2024-04-15T07:34:47Z | -
dc.date.available | 2024-04-15T07:34:47Z | -
dc.identifier.isbn | 978-1-952148-60-6 | -
dc.identifier.uri | http://hdl.handle.net/10397/105509 | -
dc.description | 2020 Conference on Empirical Methods in Natural Language Processing, 16th-20th November 2020, Online | en_US
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics (ACL) | en_US
dc.rights | © 2020 Association for Computational Linguistics | en_US
dc.rights | This publication is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Xin Wu, Yi Cai, Yang Kai, Tao Wang, and Qing Li. 2020. Task-oriented Domain-specific Meta-Embedding for Text Classification. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3508–3513, Online. Association for Computational Linguistics is available at https://doi.org/10.18653/v1/2020.emnlp-main.282. | en_US
dc.title | Task-oriented domain-specific meta-embedding for text classification | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 3508 | -
dc.identifier.epage | 3513 | -
dc.identifier.doi | 10.18653/v1/2020.emnlp-main.282 | -
dcterms.abstract | Meta-embedding learning, which combines complementary information from different word embeddings, has shown superior performance across different Natural Language Processing tasks. However, existing meta-embedding methods still ignore domain-specific knowledge, which results in unstable performance on specific domains. Moreover, the importance of general and domain-specific word embeddings depends on the downstream task, and how to regularize meta-embeddings to adapt to downstream tasks remains an unsolved problem. In this paper, we propose a method to incorporate both domain-specific and task-oriented information into meta-embeddings. We conducted extensive experiments on four text classification datasets, and the results show the effectiveness of the proposed method. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, pp. 3508-3513. Stroudsburg, PA, USA: Association for Computational Linguistics (ACL), 2020 | -
dcterms.issued | 2020 | -
dc.relation.ispartofbook | Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing | -
dc.relation.conference | Conference on Empirical Methods in Natural Language Processing [EMNLP] | -
dc.description.validate | 202402 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | COMP-0189 | en_US
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 49985237 | en_US
dc.description.oaCategory | CC | en_US
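The abstract describes combining general and domain-specific word embeddings with task-dependent weights. A minimal sketch of one common meta-embedding scheme, a softmax-weighted (attention-style) average per word, is shown below; this is an illustration of the general idea, not the paper's exact architecture, and all variable names and the dot-product scoring function are assumptions:

```python
import numpy as np

def meta_embed(general_vec, domain_vec, w_general, w_domain):
    """Combine a general and a domain-specific embedding for one word
    via a softmax-weighted average (a common meta-embedding scheme)."""
    # Score each source; a dot product with a learned weight vector
    # stands in for a task-trained attention layer.
    scores = np.array([w_general @ general_vec, w_domain @ domain_vec])
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()  # softmax attention weights, sum to 1
    return alpha[0] * general_vec + alpha[1] * domain_vec

rng = np.random.default_rng(0)
dim = 8
g = rng.normal(size=dim)   # e.g. a vector from a general-corpus embedding
d = rng.normal(size=dim)   # e.g. a vector trained on in-domain text
wg, wd = rng.normal(size=dim), rng.normal(size=dim)

meta = meta_embed(g, d, wg, wd)
print(meta.shape)  # (8,)
```

In a task-oriented setup, the scoring parameters would be trained jointly with the downstream classifier, so the mixture of general and domain information adapts to the task rather than being fixed in advance.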
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
2020.emnlp-main.282.pdf | | 372.24 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 93 (last week: 3), as of Nov 30, 2025
Downloads: 23, as of Nov 30, 2025

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.