Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114403
Title: Probing numerical concepts in financial text with BERT models
Authors: Guo, S 
Qiu, L 
Chersoni, E 
Issue Date: 2024
Source: In Proceedings of the Eighth Financial Technology and Natural Language Processing and the 1st Agent AI for Scenario Planning, pp. 73–78, Jeju, South Korea
Abstract: Numbers are an essential component of financial texts, and understanding them correctly is key for any automatic system aiming to extract and process information efficiently.
In our paper, we analyze the embeddings of different BERT-based models by testing them on supervised and unsupervised probing tasks for financial numeral understanding and value ordering.
Our results show that LMs with different types of training have complementary strengths, suggesting that their embeddings should be combined for more stable performance across tasks and categories.
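For illustration, the following is a minimal sketch of what a supervised probing setup of this kind might look like, assuming the Hugging Face transformers and scikit-learn libraries; the model name, example sentences, and numeral-category labels are hypothetical placeholders and do not reflect the paper's actual data, taxonomy, or probing protocol.

```python
# Minimal sketch of a supervised probing setup (illustrative, not the paper's exact method):
# extract contextual embeddings from a BERT-style encoder and fit a simple linear probe.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

MODEL_NAME = "bert-base-uncased"  # placeholder; a finance-specific variant could be substituted

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

# Toy sentences containing numerals, with made-up category labels.
sentences = [
    "Revenue grew by 12% in the third quarter.",
    "The company repurchased 3 million shares.",
]
labels = [0, 1]  # hypothetical numeral-category labels, not the paper's taxonomy

def embed(sentence: str) -> torch.Tensor:
    """Mean-pool the last hidden layer as a sentence-level embedding."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, hidden_size)
    return hidden.mean(dim=1).squeeze(0)

# Fit a linear probe on the frozen embeddings and inspect its predictions.
X = torch.stack([embed(s) for s in sentences]).numpy()
probe = LogisticRegression(max_iter=1000).fit(X, labels)
print(probe.predict(X))
```

An unsupervised variant of the same idea would skip the classifier and instead compare embedding similarities or orderings directly, e.g. checking whether representations of numerals preserve their value order.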
Description: Joint Workshop of the 8th Financial Technology and Natural Language Processing (FinNLP) and the 1st Agent AI for Scenario Planning (AgentScen), Jeju, South Korea, August 3, 2024
Rights: Posted with permission of FinNLP.
The following publication Shanyue Guo, Le Qiu, and Emmanuele Chersoni. 2024. Probing Numerical Concepts in Financial Text with BERT Models. In Proceedings of the Eighth Financial Technology and Natural Language Processing and the 1st Agent AI for Scenario Planning, pages 73–78, Jeju, South Korea is available at https://aclanthology.org/2024.finnlp-2.7/.
Appears in Collections: Conference Paper

Files in This Item:
File: Guo_Probing_Numerical_Concepts.pdf
Size: 270.29 kB
Format: Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.