Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/114403
Title: Probing numerical concepts in financial text with BERT models
Authors: Guo, S.; Qiu, L.; Chersoni, E.
Issue Date: 2025
Source: In Proceedings of the Eighth Financial Technology and Natural Language Processing and the 1st Agent AI for Scenario Planning, p. 73-78, Jeju, South Korea
Abstract: Numbers are an essential component of financial texts, and their correct understanding is key for automatic systems to extract and process information efficiently. In our paper, we analyze the embeddings of different BERT-based models by testing them on supervised and unsupervised probing tasks for financial numeral understanding and value ordering. Our results show that LMs with different types of training have complementary strengths, suggesting that their embeddings should be combined for more stable performance across tasks and categories.
Description: Joint Workshop of the 8th Financial Technology and Natural Language Processing (FinNLP) and the 1st Agent AI for Scenario Planning (AgentScen), Jeju, South Korea, August 3, 2024
Rights: Posted with permission of FinNLP. The following publication: Shanyue Guo, Le Qiu, and Emmanuele Chersoni. 2024. Probing Numerical Concepts in Financial Text with BERT Models. In Proceedings of the Eighth Financial Technology and Natural Language Processing and the 1st Agent AI for Scenario Planning, pages 73-78, Jeju, South Korea, is available at https://aclanthology.org/2024.finnlp-2.7/.
Appears in Collections: Conference Paper
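The abstract above mentions supervised probing tasks over BERT embeddings for financial numeral understanding. As a rough illustration of what such a probe can look like, here is a minimal sketch that mean-pools a numeral's contextual BERT vectors and trains a linear classifier on top of the frozen embeddings. The model name, toy sentences, and coarse category labels are assumptions made for illustration only and are not taken from the paper.

```python
# Minimal probing sketch: frozen BERT numeral embeddings + a linear classifier.
# Model name, toy sentences, and labels are illustrative assumptions, not the
# paper's actual data or experimental setup.
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def numeral_embedding(sentence: str, numeral: str):
    """Mean-pool the contextual vectors of the subword tokens of `numeral`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]          # (seq_len, hidden)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    target = tokenizer.tokenize(numeral)
    for i in range(len(tokens) - len(target) + 1):
        if tokens[i:i + len(target)] == target:
            return hidden[i:i + len(target)].mean(dim=0).numpy()
    return hidden[0].numpy()                                 # fall back to [CLS]

# Hypothetical toy examples: (sentence, numeral, category label).
data = [
    ("Revenue rose 12 percent year on year.", "12", "percentage"),
    ("The company reported revenue of 4.2 billion dollars.", "4.2", "monetary"),
    ("Net income fell 3 percent in the second quarter.", "3", "percentage"),
    ("Operating costs reached 1.8 billion dollars.", "1.8", "monetary"),
]

X = [numeral_embedding(s, n) for s, n, _ in data]
y = [label for _, _, label in data]

# The probe itself: a simple linear classifier trained on the frozen vectors.
probe = LogisticRegression(max_iter=1000).fit(X, y)
print(probe.predict([numeral_embedding("Margins improved 5 percent.", "5")]))
```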
Files in This Item:
File | Description | Size | Format
--- | --- | --- | ---
Guo_Probing_Numerical_Concepts.pdf | | 270.29 kB | Adobe PDF