Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/109530
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Chinese and Bilingual Studies | en_US |
dc.creator | Zhao, Q | en_US |
dc.creator | Long, Y | en_US |
dc.creator | Jiang, X | en_US |
dc.creator | Wang, Z | en_US |
dc.creator | Huang, CR | en_US |
dc.creator | Zhou, G | en_US |
dc.date.accessioned | 2024-11-06T02:20:15Z | - |
dc.date.available | 2024-11-06T02:20:15Z | - |
dc.identifier.uri | http://hdl.handle.net/10397/109530 | - |
dc.language.iso | en | en_US |
dc.publisher | Cambridge University Press | en_US |
dc.rights | © The Author(s), 2024. Published by Cambridge University Press. This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited. | en_US |
dc.rights | The following publication Zhao, Q., Long, Y., Jiang, X., Wang, Z., Huang, C.-R., & Zhou, G. (2024). Linguistic synesthesia detection: Leveraging culturally enriched linguistic features. Natural Language Processing, 1–23 is available at https://doi.org/10.1017/nlp.2024.9. | en_US |
dc.subject | A neural network model | en_US |
dc.subject | Chinese | en_US |
dc.subject | Linguistic features | en_US |
dc.subject | Linguistic synesthesia | en_US |
dc.title | Linguistic synesthesia detection: leveraging culturally enriched linguistic features | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.doi | 10.1017/nlp.2024.9 | en_US |
dcterms.abstract | Linguistic synesthesia, a productive form of figurative language, has received little attention in the field of Natural Language Processing (NLP). Although linguistic synesthesia resembles metaphor in involving conceptual mappings and in its usefulness for NLP tasks such as sentiment analysis and stance detection, well-studied metaphor detection methods cannot be applied directly to detecting linguistic synesthesia. This study incorporates comprehensive linguistic features (i.e., character and radical information, word segmentation information, and part-of-speech tagging) into a neural model to automatically detect synesthetic usages in a sentence. In particular, we employ a span-based boundary detection model to extract sensory words. In addition, a joint model is proposed to detect the original and synesthetic modalities of the sensory words collectively (see the illustrative sketch following the metadata table). In the experiments, the model achieves state-of-the-art results on the dataset for linguistic synesthesia detection, showing that leveraging culturally enriched linguistic features and joint learning is effective for this task. Furthermore, as the proposed model relies on non-language-specific linguistic features, it could be applied to detecting linguistic synesthesia in other languages. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Natural language processing, Published online by Cambridge University Press: 09 September 2024, FirstView, https://doi.org/10.1017/nlp.2024.9 | en_US |
dcterms.isPartOf | Natural language processing | en_US |
dcterms.issued | 2024 | - |
dc.identifier.eissn | 2977-0424 | en_US |
dc.description.validate | 202411 bcch | en_US |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | OA_TA | - |
dc.description.fundingSource | Self-funded | en_US |
dc.description.pubStatus | Early release | en_US |
dc.description.TA | CUP (2024) | en_US |
dc.description.oaCategory | TA | en_US |
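The abstract names the model's moving parts (culturally enriched character/radical/POS features, span-based boundary detection for sensory words, joint prediction of original and synesthetic modalities) without showing how they connect. The PyTorch sketch below is one plausible arrangement of those parts, not the authors' released implementation: the BiLSTM encoder, the embedding dimensions, the five-sense modality inventory, and every identifier are illustrative assumptions.

```python
# A minimal sketch of the architecture the abstract describes; all names,
# dimensions, and the modality inventory are assumptions, not the paper's code.
import torch
import torch.nn as nn

MODALITIES = ["touch", "taste", "smell", "vision", "hearing"]  # assumed inventory

class SynesthesiaDetector(nn.Module):
    def __init__(self, vocab_size, radical_size, pos_size, dim=128):
        super().__init__()
        # Culturally enriched linguistic features: characters, radicals, POS tags.
        self.char_emb = nn.Embedding(vocab_size, dim)
        self.radical_emb = nn.Embedding(radical_size, dim)
        self.pos_emb = nn.Embedding(pos_size, dim)
        self.encoder = nn.LSTM(3 * dim, dim, batch_first=True, bidirectional=True)
        # Span-based boundary detection: per-token start/end scores.
        self.start_head = nn.Linear(2 * dim, 2)
        self.end_head = nn.Linear(2 * dim, 2)
        # Joint heads: original and synesthetic modalities share the span encoding.
        self.orig_head = nn.Linear(4 * dim, len(MODALITIES))
        self.syn_head = nn.Linear(4 * dim, len(MODALITIES))

    def forward(self, chars, radicals, pos_tags):
        # Concatenate the three feature embeddings, then contextualize.
        x = torch.cat([self.char_emb(chars),
                       self.radical_emb(radicals),
                       self.pos_emb(pos_tags)], dim=-1)
        h, _ = self.encoder(x)               # (batch, seq, 2*dim)
        return h, self.start_head(h), self.end_head(h)

    def classify_span(self, h, start, end):
        # Represent a detected sensory-word span by its boundary states and
        # predict both modalities jointly from that shared representation.
        span = torch.cat([h[:, start], h[:, end]], dim=-1)
        return self.orig_head(span), self.syn_head(span)
```

Feeding the model integer-encoded characters, radicals, and POS tags yields boundary logits for extracting sensory-word spans; `classify_span` then scores each extracted span against both modality inventories, which is the joint learning the abstract credits for the reported gains.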
Appears in Collections: | Journal/Magazine Article |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Zhao_Linguistic_Synesthesia_Detection.pdf | | 3.31 MB | Adobe PDF | View/Open |