Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/95825
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Chinese and Bilingual Studies | en_US |
| dc.creator | Peng, B | en_US |
| dc.creator | Chersoni, E | en_US |
| dc.creator | Hsu, YY | en_US |
| dc.creator | Huang, CR | en_US |
| dc.date.accessioned | 2022-10-17T07:47:12Z | - |
| dc.date.available | 2022-10-17T07:47:12Z | - |
| dc.identifier.isbn | 979-10-95546-74-0 | en_US |
| dc.identifier.uri | http://hdl.handle.net/10397/95825 | - |
| dc.description | 4th Financial Narrative Processing Workshop FNP 2022, Language Resources and Evaluation Conference, 24 June 2022, Marseille, France | en_US |
| dc.language.iso | en | en_US |
| dc.publisher | European Language Resources Association (ELRA) | en_US |
| dc.rights | © European Language Resources Association (ELRA) | en_US |
| dc.rights | These workshop proceedings are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (https://creativecommons.org/licenses/by-nc/4.0/) | en_US |
| dc.rights | The following publication Peng, B., Chersoni, E., Hsu, Y. Y., & Huang, C. R. (2022, June). Discovering Financial Hypernyms by Prompting Masked Language Models. In M. El-Haj, P. Rayson & N. Zmandar (Eds.), Proceedings of the 4th Financial Narrative Processing Workshop (FNP 2022) (pp. 10-16). Paris: European Language Resources Association (ELRA) is available at http://www.lrec-conf.org/proceedings/lrec2022/workshops/FNP/index.html | en_US |
| dc.subject | Transformers | en_US |
| dc.subject | Semantic relations | en_US |
| dc.subject | Language modeling | en_US |
| dc.subject | Financial natural language processing | en_US |
| dc.title | Discovering financial hypernyms by prompting masked language models | en_US |
| dc.type | Conference Paper | en_US |
| dc.identifier.spage | 10 | en_US |
| dc.identifier.epage | 16 | en_US |
| dcterms.abstract | With the rising popularity of Transformer-based language models, several studies have tried to exploit their masked language modeling capabilities to automatically extract relational linguistic knowledge, although this kind of research has rarely investigated semantic relations in specialized domains. The present study aims to test a general-domain and a domain-adapted Transformer model on two datasets of financial term-hypernym pairs using the prompt methodology. Our results show that differences between prompts have a critical impact on the models’ performance, and that domain adaptation to financial texts generally improves the models’ capacity to associate the target terms with the right hypernyms, although the most successful models are those that retain a general-domain vocabulary. | en_US |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | In M. El-Haj, P. Rayson & N. Zmandar (Eds.), Proceedings of the 4th Financial Narrative Processing Workshop (FNP 2022), pp. 10-16. Paris: European Language Resources Association (ELRA). | en_US |
| dcterms.issued | 2022-06 | - |
| dc.relation.conference | Financial Narrative Processing Workshop [FNP] | en_US |
| dc.publisher.place | Paris, France | en_US |
| dc.description.validate | 202210 bcch | en_US |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | a1795 | - |
| dc.identifier.SubFormID | 45959 | - |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | Projects ZVYU and ZG9X | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.description.oaCategory | CC | en_US |
Appears in Collections: Conference Paper
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Peng_Financial_Hypernyms_Language.pdf | | 366.24 kB | Adobe PDF |
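The prompt methodology summarized in the abstract can be illustrated with a minimal sketch using a masked language model via the Hugging Face `transformers` fill-mask pipeline. The checkpoint (`bert-base-uncased`), the prompt template, and the example term and hypernym candidates below are illustrative assumptions, not the prompts, models, or datasets used in the paper.

```python
# Minimal sketch of hypernym discovery via masked-LM prompting.
# Assumptions: the checkpoint, the prompt template, and the example
# term/hypernym candidates are illustrative, not the paper's setup.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

term = "bond"  # hypothetical financial term
candidate_hypernyms = {"security", "instrument", "investment", "debt"}

# Prompt the masked LM and rank its predictions for the [MASK] slot.
predictions = fill_mask(f"A {term} is a type of [MASK].", top_k=20)

for pred in predictions:
    token = pred["token_str"].strip()
    marker = "<- candidate hypernym" if token in candidate_hypernyms else ""
    print(f"{token:15s} {pred['score']:.4f} {marker}")
```

In this kind of setup, a prediction counts as a hit when a candidate hypernym appears among the top-ranked fillers; the abstract notes that both the wording of the prompt and the domain adaptation of the model can change these rankings substantially.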