Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/118208
DC Field · Value · Language
dc.contributorSchool of Fashion and Textiles-
dc.creatorJiang, L-
dc.creatorZeng, F-
dc.creatorYu, A-
dc.date.accessioned2026-03-23T01:37:11Z-
dc.date.available2026-03-23T01:37:11Z-
dc.identifier.issn1534-4320-
dc.identifier.urihttp://hdl.handle.net/10397/118208-
dc.language.isoenen_US
dc.publisherInstitute of Electrical and Electronics Engineersen_US
dc.rights© 2025 The Authors. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License. For more information, see https://creativecommons.org/licenses/by-nc-nd/4.0/.en_US
dc.rightsThe following publication L. Jiang, F. Zeng and A. Yu, "Comparative Learning for Cross-Subject Finger Movement Recognition in Three Arm Postures via Data Glove," in IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 33, pp. 2531-2541, 2025 is available at https://doi.org/10.1109/TNSRE.2025.3583303.en_US
dc.subjectComparative learningen_US
dc.subjectCross-subjecten_US
dc.subjectData gloveen_US
dc.subjectFinger movement recognitionen_US
dc.subjectSiamese networken_US
dc.titleComparative learning for cross-subject finger movement recognition in three arm postures via data gloveen_US
dc.typeJournal/Magazine Articleen_US
dc.identifier.spage2531-
dc.identifier.epage2541-
dc.identifier.volume33-
dc.identifier.doi10.1109/TNSRE.2025.3583303-
dcterms.abstractReliable recognition of therapeutic hand and finger movements is a prerequisite for effective home-based rehabilitation, where patients must exercise without continuous therapist supervision. Inter-subject variability, stemming from differences in hand size, joint flexibility, and movement speed, limits the generalization of data-glove models. We present CLAPISA, a contrastive-learning framework that embeds a Siamese network into a CNN-LSTM spatiotemporal pipeline for cross-subject gesture recognition. Training employs a 1:2 positive-to-negative pairing strategy and an empirically optimized margin of 1.0, enabling the network to form subject-invariant, rehabilitation-relevant embeddings. Evaluated on a bending-sensor dataset of twenty young adults, CLAPISA attains an average accuracy of 96.71% under leave-one-subject-out cross-validation, outperforming five baseline models and reducing errors for the most challenging subjects by up to 12.3%. Although current validation is limited to a young cohort, the framework's data efficiency and subject-invariant design indicate strong potential for extension to elderly and neurologically impaired populations; collecting such data for further verification is our next step.-
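The abstract describes a Siamese network trained with a margin-based contrastive objective (margin 1.0) and a 1:2 positive-to-negative pairing ratio. A minimal NumPy sketch of the standard contrastive loss under those settings; the paper's exact formulation may differ, and the function name and batch values here are illustrative assumptions:

```python
import numpy as np

def contrastive_loss(d, y, margin=1.0):
    """Margin-based contrastive loss on pairwise embedding distance(s) d.

    y = 1 for a positive pair (same gesture, different subjects): pulled together.
    y = 0 for a negative pair: pushed apart until the distance exceeds the margin.
    """
    d = np.asarray(d, dtype=float)
    y = np.asarray(y, dtype=float)
    return y * d**2 + (1 - y) * np.maximum(margin - d, 0.0) ** 2

# A 1:2 positive-to-negative mini-batch, as in the pairing strategy above:
distances = np.array([0.1, 0.4, 1.5])  # Siamese embedding distances (assumed values)
labels = np.array([1, 0, 0])           # one positive pair, two negative pairs
batch_loss = contrastive_loss(distances, labels, margin=1.0).mean()
```

Negatives already farther apart than the margin (the 1.5 distance above) contribute zero loss, so training focuses on hard negative pairs.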
dcterms.accessRightsopen accessen_US
dcterms.bibliographicCitationIEEE transactions on neural systems and rehabilitation engineering, 2025, v. 33, p. 2531-2541-
dcterms.isPartOfIEEE transactions on neural systems and rehabilitation engineering-
dcterms.issued2025-
dc.identifier.scopus2-s2.0-105009421627-
dc.identifier.pmid40569808-
dc.identifier.eissn1558-0210-
dc.description.validate202603 bcjz-
dc.description.oaVersion of Recorden_US
dc.identifier.FolderNumberOA_Scopus/WOSen_US
dc.description.fundingSourceOthersen_US
dc.description.fundingTextThis work was supported in part by the Key Laboratory of Intelligent Textile and Flexible Interconnection, Zhejiang Province under Grant YB16, in part by China Postdoctoral Science Foundation under Grant 2024M750518, and in part by the Natural Science Foundation of Ningbo under Grant 2024J235 and Grant 2022J138.en_US
dc.description.pubStatusPublisheden_US
dc.description.oaCategoryCCen_US
Appears in Collections:Journal/Magazine Article
Files in This Item:
File: Jiang_Comparative_Learning_Cross-subject.pdf · Size: 2.6 MB · Format: Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

SCOPUS™ Citations: 1 (as of May 8, 2026)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.