Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/107878
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Computing | en_US |
dc.creator | Zhang, Y | en_US |
dc.creator | Li, J | en_US |
dc.creator | Li, W | en_US |
dc.date.accessioned | 2024-07-15T07:55:29Z | - |
dc.date.available | 2024-07-15T07:55:29Z | - |
dc.identifier.isbn | 979-8-89176-060-8 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/107878 | - |
dc.description | The 2023 Conference on Empirical Methods in Natural Language Processing, December 6-10, 2023, Singapore | en_US |
dc.language.iso | en | en_US |
dc.publisher | Association for Computational Linguistics (ACL) | en_US |
dc.rights | © 2023 Association for Computational Linguistics | en_US |
dc.rights | Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US |
dc.rights | The following publication Yuji Zhang, Jing Li, and Wenjie Li. 2023. VIBE: Topic-Driven Temporal Adaptation for Twitter Classification. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3340–3354, Singapore. Association for Computational Linguistics is available at https://aclanthology.org/2023.emnlp-main.203/. | en_US |
dc.title | VIBE: Topic-driven temporal adaptation for Twitter classification | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 3340 | en_US |
dc.identifier.epage | 3354 | en_US |
dcterms.abstract | Language features evolve in real-world social media, causing text classification performance to deteriorate over time. To address this challenge, we study temporal adaptation, where models trained on past data are tested in the future. Most prior work focuses on continued pretraining or knowledge updating, whose performance may be compromised on noisy social media data. To tackle this issue, we reflect feature change by modeling latent topic evolution and propose a novel model, VIBE: Variational Information Bottleneck for Evolutions. Concretely, we first employ two Information Bottleneck (IB) regularizers to distinguish past from future topics. The distinguished topics then serve as adaptive features via multi-task training with timestamp and class label prediction. In adaptive learning, VIBE exploits unlabeled data retrieved from online streams created after the training data period. Substantial experiments on three Twitter classification tasks show that our model, using only 3% of the data, significantly outperforms previous state-of-the-art continued-pretraining methods. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3340–3354, Singapore. Association for Computational Linguistics, 2023 | en_US |
dcterms.issued | 2023 | - |
dc.relation.conference | Conference on Empirical Methods in Natural Language Processing [EMNLP] | en_US |
dc.description.validate | 202407 bcwh | en_US |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | a3033 | - |
dc.identifier.SubFormID | 49242 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | CC | en_US |
Appears in Collections: | Conference Paper |
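The abstract describes VIBE's objective at a high level: two Information Bottleneck regularizers to compress topic representations, combined with multi-task training on timestamp and class-label prediction. As a rough illustration only (this is not the authors' code; the function names, the diagonal-Gaussian posterior, and the `beta` weight are assumptions), a VIB-style multi-task loss can be sketched as:

```python
import numpy as np

def kl_gaussian(mu, logvar):
    # KL( N(mu, diag(exp(logvar))) || N(0, I) ): the IB compression term
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

def cross_entropy(logits, label):
    # softmax cross-entropy for a single example
    z = logits - logits.max()
    logp = z - np.log(np.exp(z).sum())
    return -logp[label]

def vib_multitask_loss(mu, logvar, class_logits, time_logits,
                      class_label, time_label, beta=1e-3):
    # total loss = class-label CE + timestamp CE + beta * KL compression,
    # mirroring the multi-task IB training sketched in the abstract
    return (cross_entropy(class_logits, class_label)
            + cross_entropy(time_logits, time_label)
            + beta * kl_gaussian(mu, logvar))
```

Here `mu` and `logvar` would come from an encoder over the latent topics; the two cross-entropy heads correspond to the class-label and timestamp prediction tasks, and the KL term regularizes the stochastic representation toward a standard Gaussian prior.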
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
2023.emnlp-main.203.pdf |  | 2.11 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.