Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/91924
View/Download Full Text
DC Field | Value | Language
dc.contributor | Department of Chinese and Bilingual Studies | en_US
dc.creator | Peng, B | en_US
dc.creator | Chersoni, E | en_US
dc.creator | Hsu, YY | en_US
dc.creator | Huang, CR | en_US
dc.date.accessioned | 2022-01-18T06:24:42Z | -
dc.date.available | 2022-01-18T06:24:42Z | -
dc.identifier.isbn | 978-1-954085-84-8 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/91924 | -
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics (ACL) | en_US
dc.rights | ©2021 Association for Computational Linguistics | en_US
dc.rights | ACL materials are Copyright © 1963–2021 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Peng, B., Chersoni, E., Hsu, Y. Y., & Huang, C. R. (2021, November). Is Domain Adaptation Worth Your Investment? Comparing BERT and FinBERT on Financial Tasks. In Proceedings of the Third Workshop on Economics and Natural Language Processing (pp. 37-44) is available at https://doi.org/10.18653/v1/2021.econlp-1.5 | en_US
dc.title | Is domain adaptation worth your investment? Comparing BERT and FinBERT on financial tasks | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 37 | en_US
dc.identifier.epage | 44 | en_US
dc.identifier.doi | 10.18653/v1/2021.econlp-1.5 | en_US
dcterms.abstract | With the recent rise in popularity of Transformer models in Natural Language Processing, research efforts have been dedicated to the development of domain-adapted versions of BERT-like architectures. | en_US
dcterms.abstract | In this study, we focus on FinBERT, a Transformer model trained on text from the financial domain. By comparing its performances with the original BERT on a wide variety of financial text processing tasks, we found continual pretraining from the original model to be the more beneficial option. Domain-specific pretraining from scratch, conversely, seems to be less effective. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Proceedings of the Third Workshop on Economics and Natural Language Processing (ECONLP 2021), November 11, 2021, Punta Cana, Dominican Republic and Online, p. 37–44. Stroudsburg, PA: Association for Computational Linguistics (ACL), 2021 | en_US
dcterms.issued | 2021 | -
dc.relation.ispartofbook | Proceedings of the Third Workshop on Economics and Natural Language Processing (ECONLP 2021), November 11, 2021, Punta Cana, Dominican Republic and Online | en_US
dc.relation.conference | Workshop on Economics and Natural Language Processing [ECONLP] | en_US
dc.description.validate | 202201 bcvc | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | a1141-n01 | -
dc.identifier.SubFormID | 43994 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | W16H | en_US
dc.description.pubStatus | Published | en_US
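
The abstract above describes a comparison between the general-domain BERT and a domain-adapted FinBERT on downstream financial text processing tasks. The following is a minimal, illustrative sketch of such a comparison, assuming the Hugging Face transformers library; the checkpoint names, the toy data, and the helper code are assumptions for illustration only and are not the authors' experimental setup.

```python
# Minimal sketch (not the authors' code): fine-tune a general-domain BERT and a
# FinBERT checkpoint on the same toy financial classification task, so their
# downstream behavior can be compared side by side.
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Toy labeled sentences standing in for a financial task such as sentiment analysis.
texts = ["Operating profit rose sharply in the third quarter.",
         "The company issued a profit warning for the fiscal year."]
labels = [1, 0]  # 1 = positive, 0 = negative (illustrative labels)

class TinyDataset(torch.utils.data.Dataset):
    """Wraps tokenizer output and labels for the Trainer."""
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

def finetune(checkpoint):
    """Fine-tune one checkpoint; the comparison is simply running this twice."""
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
    enc = tokenizer(texts, truncation=True, padding=True)
    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="out-" + checkpoint.replace("/", "-"),
                               num_train_epochs=1,
                               per_device_train_batch_size=2),
        train_dataset=TinyDataset(enc, labels),
    )
    trainer.train()
    return trainer

# General-domain BERT vs. a domain-adapted FinBERT checkpoint (identifier assumed).
for ckpt in ["bert-base-uncased", "yiyanghkust/finbert-pretrain"]:
    finetune(ckpt)
```

In the setting reported in the abstract, the FinBERT variant obtained by continual pretraining from the original BERT checkpoint was the more beneficial option, while domain-specific pretraining from scratch was less effective.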
Appears in Collections: Conference Paper
Files in This Item:
File | Size | Format
Peng_Domain_Adaptation_Worth.pdf | 396.5 kB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 98 (as of Apr 14, 2024)
Downloads: 38 (as of Apr 14, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.