Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/91924
Title: Is domain adaptation worth your investment? Comparing BERT and FinBERT on financial tasks
Authors: Peng, B 
Chersoni, E 
Hsu, YY 
Huang, CR 
Issue Date: 2021
Source: In Proceedings of the Third Workshop on Economics and Natural Language Processing (ECONLP 2021), November 11, 2021, Punta Cana, Dominican Republic and Online, p. 37–44. Stroudsburg, PA: Association for Computational Linguistics (ACL), 2021
Abstract: With the recent rise in popularity of Transformer models in Natural Language Processing, research efforts have been dedicated to the development of domain-adapted versions of BERT-like architectures.
In this study, we focus on FinBERT, a Transformer model trained on text from the financial domain. By comparing its performance with that of the original BERT on a wide variety of financial text processing tasks, we found continual pretraining from the original model to be the more beneficial option. Domain-specific pretraining from scratch, conversely, seems to be less effective.
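As an illustration of the continual pretraining setup the paper compares, here is a minimal sketch using the HuggingFace Transformers library; the checkpoint name, corpus file, and hyperparameters are illustrative assumptions, not the authors' exact configuration.

from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Continual pretraining: initialize from the original BERT weights
# rather than from a random initialization.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical in-domain corpus: plain text, one document per line.
dataset = load_dataset("text", data_files={"train": "financial_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Masked language modeling (BERT's pretraining objective): mask 15% of tokens.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-financial-continual", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()  # adapt to financial text before task-specific fine-tuning

Pretraining from scratch, the alternative the paper finds less effective, would differ mainly in initializing the model from a configuration (e.g., AutoModelForMaskedLM.from_config(...)) instead of loading the pretrained checkpoint.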
Publisher: Association for Computational Linguistics (ACL)
ISBN: 978-1-954085-84-8
DOI: 10.18653/v1/2021.econlp-1.5
Rights: ©2021 Association for Computational Linguistics
ACL materials are Copyright © 1963–2021 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
The following publication Peng, B., Chersoni, E., Hsu, Y. Y., & Huang, C. R. (2021, November). Is Domain Adaptation Worth Your Investment? Comparing BERT and FinBERT on Financial Tasks. In Proceedings of the Third Workshop on Economics and Natural Language Processing (pp. 37-44) is available at https://doi.org/10.18653/v1/2021.econlp-1.5
Appears in Collections: Conference Paper

Files in This Item:
File: Peng_Domain_Adaptation_Worth.pdf (396.5 kB, Adobe PDF)
Open Access Information:
Status: Open access
File Version: Version of Record

Page views: 98 (as of Apr 14, 2024)
Downloads: 38 (as of Apr 14, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.