Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/91924
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Chinese and Bilingual Studies | en_US |
dc.creator | Peng, B | en_US |
dc.creator | Chersoni, E | en_US |
dc.creator | Hsu, YY | en_US |
dc.creator | Huang, CR | en_US |
dc.date.accessioned | 2022-01-18T06:24:42Z | - |
dc.date.available | 2022-01-18T06:24:42Z | - |
dc.identifier.isbn | 978-1-954085-84-8 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/91924 | - |
dc.language.iso | en | en_US |
dc.publisher | Association for Computational Linguistics (ACL) | en_US |
dc.rights | ©2021 Association for Computational Linguistics | en_US |
dc.rights | ACL materials are Copyright © 1963–2021 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US |
dc.rights | The following publication Peng, B., Chersoni, E., Hsu, Y. Y., & Huang, C. R. (2021, November). Is Domain Adaptation Worth Your Investment? Comparing BERT and FinBERT on Financial Tasks. In Proceedings of the Third Workshop on Economics and Natural Language Processing (pp. 37-44) is available at https://doi.org/10.18653/v1/2021.econlp-1.5 | en_US |
dc.title | Is domain adaptation worth your investment? Comparing BERT and FinBERT on financial tasks | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 37 | en_US |
dc.identifier.epage | 44 | en_US |
dc.identifier.doi | 10.18653/v1/2021.econlp-1.5 | en_US |
dcterms.abstract | With the recent rise in popularity of Transformer models in Natural Language Processing, research efforts have been dedicated to the development of domain-adapted versions of BERT-like architectures. | en_US |
dcterms.abstract | In this study, we focus on FinBERT, a Transformer model trained on text from the financial domain. By comparing its performance with that of the original BERT on a wide variety of financial text processing tasks, we found continual pretraining from the original model to be the more beneficial option. Domain-specific pretraining from scratch, conversely, seems to be less effective. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | In Proceedings of the Third Workshop on Economics and Natural Language Processing (ECONLP 2021), November 11, 2021, Punta Cana, Dominican Republic and Online, p. 37–44. Stroudsburg, PA: Association for Computational Linguistics (ACL), 2021 | en_US |
dcterms.issued | 2021 | - |
dc.relation.ispartofbook | Proceedings of the Third Workshop on Economics and Natural Language Processing (ECONLP 2021), November 11, 2021, Punta Cana, Dominican Republic and Online | en_US |
dc.relation.conference | Workshop on Economics and Natural Language Processing [ECONLP] | en_US |
dc.description.validate | 202201 bcvc | en_US |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | a1141-n01 | - |
dc.identifier.SubFormID | 43994 | - |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | W16H | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | CC | en_US |
Appears in Collections: | Conference Paper |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Peng_Domain_Adaptation_Worth.pdf | | 396.5 kB | Adobe PDF | View/Open |