Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/109116
DC Field | Value | Language
dc.contributor | Department of Computing | -
dc.creator | Yan, Y | -
dc.creator | Zhang, BW | -
dc.creator | Ding, G | -
dc.creator | Li, W | -
dc.creator | Zhang, J | -
dc.creator | Li, JJ | -
dc.creator | Gao, W | -
dc.date.accessioned | 2024-09-19T03:13:21Z | -
dc.date.available | 2024-09-19T03:13:21Z | -
dc.identifier.issn | 1866-9956 | -
dc.identifier.uri | http://hdl.handle.net/10397/109116 | -
dc.language.iso | en | en_US
dc.publisher | Springer New York LLC | en_US
dc.rights | © The Author(s) 2023, corrected publication 2023 | en_US
dc.rights | This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. | en_US
dc.rights | The following publication Yan, Y., Zhang, BW., Ding, G. et al. O2-Bert: Two-Stage Target-Based Sentiment Analysis. Cogn Comput 16, 158–176 (2024) is available at https://doi.org/10.1007/s12559-023-10191-y. | en_US
dc.subject | Entity length prediction | en_US
dc.subject | Entity number prediction | en_US
dc.subject | Entity starting annotation | en_US
dc.subject | O2-Bert | en_US
dc.subject | OSC-Bert | en_US
dc.subject | OTE-Bert | en_US
dc.title | O²-Bert: two-stage target-based sentiment analysis | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 158 | -
dc.identifier.epage | 176 | -
dc.identifier.volume | 16 | -
dc.identifier.issue | 1 | -
dc.identifier.doi | 10.1007/s12559-023-10191-y | -
dcterms.abstract | Target-based sentiment analysis (TBSA) is one of the most important NLP research topics for widespread applications. However, the task is challenging, especially when the targets contain multiple words or do not exist in the sequences. Conventional approaches cannot accurately extract the (target, sentiment) pairs due to the limitations of the fixed end-to-end architecture design. In this paper, we propose a framework named O2-Bert, which consists of Opinion target extraction (OTE-Bert) and Opinion sentiment classification (OSC-Bert) to complete the task in two stages. More specifically, we divide OTE-Bert into three modules. First, an entity number prediction module predicts the number of entities in a sequence, even in the extreme situation where no entities are contained. Afterwards, with the predicted number of entities, an entity starting annotation module is responsible for predicting their starting positions. Finally, an entity length prediction module predicts the lengths of these entities, and thus accomplishes target extraction. In OSC-Bert, the sentiment polarities of the targets extracted by OTE-Bert are classified. According to the characteristics of BERT encoders, our framework can be adapted to short English sequences without domain limitations. For other languages, our approach might work through altering the tokenization. Experimental results on the SemEval 2014-16 benchmarks show that the proposed model achieves competitive performance on both domains (restaurants and laptops) and both tasks (target extraction and sentiment classification), with F1-score as the evaluation metric. Specifically, OTE-Bert achieves 84.63%, 89.20%, 83.16%, and 86.88% F1 scores for target extraction, while OSC-Bert achieves 82.90%, 80.73%, 76.94%, and 83.58% F1 scores for sentiment classification, on the chosen benchmarks. The statistics validate the effectiveness and robustness of our approach and the new "two-stage paradigm". In future work, we will explore more possibilities of the new paradigm on other NLP tasks. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Cognitive computation, Jan. 2024, v. 16, no. 1, p. 158-176 | -
dcterms.isPartOf | Cognitive computation | -
dcterms.issued | 2024-01 | -
dc.identifier.scopus | 2-s2.0-85169161519 | -
dc.identifier.eissn | 1866-9964 | -
dc.description.validate | 202409 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Fundamental Research Funds for Central Universities and National foreign specialized projects | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
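The abstract above describes a two-stage pipeline: OTE-Bert first predicts how many targets a sequence contains, then their starting positions, then their lengths, and OSC-Bert classifies the sentiment of each extracted target. Below is a minimal sketch of that data flow only, with hypothetical names (ExtractedTarget, ote_stage, osc_stage) and toy stand-in predictors; it is not the authors' implementation, in which each module is a BERT-based classifier.

# Minimal sketch of the two-stage (target, sentiment) flow described in the
# abstract. All names and stand-in predictors here are hypothetical.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ExtractedTarget:
    start: int          # index of the first token of the target
    length: int         # number of tokens in the target
    tokens: List[str]   # the target's token span


def ote_stage(tokens: List[str],
              predict_entity_count,
              predict_start_positions,
              predict_lengths) -> List[ExtractedTarget]:
    """Stage 1 (OTE): entity count -> starting positions -> lengths."""
    n = predict_entity_count(tokens)        # may be 0 (no targets in the sequence)
    if n == 0:
        return []
    starts = predict_start_positions(tokens, n)
    lengths = predict_lengths(tokens, starts)
    return [ExtractedTarget(s, l, tokens[s:s + l])
            for s, l in zip(starts, lengths)]


def osc_stage(tokens: List[str],
              targets: List[ExtractedTarget],
              classify_sentiment) -> List[Tuple[str, str]]:
    """Stage 2 (OSC): classify the polarity of each extracted target."""
    return [(" ".join(t.tokens), classify_sentiment(tokens, t)) for t in targets]


if __name__ == "__main__":
    # Toy stand-ins for the three OTE modules and the OSC classifier.
    sentence = "The battery life is great but the screen is dim".split()
    targets = ote_stage(
        sentence,
        predict_entity_count=lambda toks: 2,
        predict_start_positions=lambda toks, n: [1, 7],
        predict_lengths=lambda toks, starts: [2, 1],
    )
    pairs = osc_stage(
        sentence,
        targets,
        classify_sentiment=lambda toks, t: "positive" if t.start == 1 else "negative",
    )
    print(pairs)  # [('battery life', 'positive'), ('screen', 'negative')]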
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
s12559-023-10191-y.pdf | - | 5.33 MB | Adobe PDF (View/Open)
Open Access Information
Status: open access
File Version: Version of Record
Access: View full-text via PolyU eLinks SFX Query

Page views: 22 (as of Nov 24, 2024)
Downloads: 6 (as of Nov 24, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.