Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/105509
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Computing | - |
| dc.creator | Wu, X | - |
| dc.creator | Cai, Y | - |
| dc.creator | Kai, Y | - |
| dc.creator | Wang, T | - |
| dc.creator | Li, Q | - |
| dc.date.accessioned | 2024-04-15T07:34:47Z | - |
| dc.date.available | 2024-04-15T07:34:47Z | - |
| dc.identifier.isbn | 978-1-952148-60-6 | - |
| dc.identifier.uri | http://hdl.handle.net/10397/105509 | - |
| dc.description | 2020 Conference on Empirical Methods in Natural Language Processing, 16th-20th November 2020, Online | en_US |
| dc.language.iso | en | en_US |
| dc.publisher | Association for Computational Linguistics (ACL) | en_US |
| dc.rights | © 2020 Association for Computational Linguistics | en_US |
| dc.rights | This publication is licensed under a Creative Commons Attribution 4.0 International License. (https://creativecommons.org/licenses/by/4.0/) | en_US |
| dc.rights | The following publication Xin Wu, Yi Cai, Yang Kai, Tao Wang, and Qing Li. 2020. Task-oriented Domain-specific Meta-Embedding for Text Classification. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3508–3513, Online. Association for Computational Linguistics is available at https://doi.org/10.18653/v1/2020.emnlp-main.282. | en_US |
| dc.title | Task-oriented domain-specific meta-embedding for text classification | en_US |
| dc.type | Conference Paper | en_US |
| dc.identifier.spage | 3508 | - |
| dc.identifier.epage | 3513 | - |
| dc.identifier.doi | 10.18653/v1/2020.emnlp-main.282 | - |
| dcterms.abstract | Meta-embedding learning, which combines complementary information from different word embeddings, has shown superior performance across various Natural Language Processing tasks. However, existing meta-embedding methods ignore domain-specific knowledge, which results in unstable performance across specific domains. Moreover, the importance of general and domain word embeddings depends on the downstream task, and how to regularize meta-embeddings to adapt to downstream tasks remains an unsolved problem. In this paper, we propose a method to incorporate both domain-specific and task-oriented information into meta-embeddings. We conducted extensive experiments on four text classification datasets, and the results show the effectiveness of our proposed method. | - |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, p. 3508-3513. Stroudsburg, PA, USA: Association for Computational Linguistics (ACL), 2020 | - |
| dcterms.issued | 2020 | - |
| dc.relation.ispartofbook | Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing | - |
| dc.relation.conference | Conference on Empirical Methods in Natural Language Processing [EMNLP] | - |
| dc.description.validate | 202402 bcch | - |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | COMP-0189 | en_US |
| dc.description.fundingSource | RGC | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.identifier.OPUS | 49985237 | en_US |
| dc.description.oaCategory | CC | en_US |
| Appears in Collections: | Conference Paper | |
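The abstract describes combining general and domain word embeddings with task-oriented weighting. As a minimal illustrative sketch only (not the paper's actual method — the function name, inputs, and softmax weighting below are assumptions for illustration), one common way to form a meta-embedding is a learned convex combination of source embeddings projected into a shared space:

```python
import numpy as np

def meta_embed(embeddings, task_scores):
    """Combine source embeddings into one meta-embedding.

    embeddings: list of (d,) vectors from different embedding spaces,
        assumed already projected to a common dimension d.
    task_scores: unnormalized per-source importance scores, which in a
        task-oriented setup would be learned from the downstream task.
    """
    scores = np.asarray(task_scores, dtype=float)
    w = np.exp(scores - scores.max())   # numerically stable softmax
    w = w / w.sum()
    stacked = np.stack(embeddings)      # shape (n_sources, d)
    return w @ stacked                  # convex combination, shape (d,)
```

With equal scores this reduces to a simple average of the source embeddings; skewed scores let the downstream task emphasize, say, a domain-specific embedding over a general one.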
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| 2020.emnlp-main.282.pdf | 372.24 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.