Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/105229
DC Field | Value | Language
dc.contributor | Department of Rehabilitation Sciences | -
dc.creator | Li, J | -
dc.creator | Zhong, J | -
dc.creator | Wang, N | -
dc.date.accessioned | 2024-04-12T06:50:53Z | -
dc.date.available | 2024-04-12T06:50:53Z | -
dc.identifier.issn | 1662-453X | -
dc.identifier.uri | http://hdl.handle.net/10397/105229 | -
dc.language.iso | en | en_US
dc.publisher | Frontiers Research Foundation | en_US
dc.rights | Copyright © 2023 Li, Zhong and Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) (https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. | en_US
dc.rights | The following publication Li J, Zhong J and Wang N (2023) A multimodal human-robot sign language interaction framework applied in social robots. Front. Neurosci. 17:1168888 is available at https://doi.org/10.3389/fnins.2023.1168888. | en_US
dc.subject | Gesture recognition | en_US
dc.subject | Human-robot interaction | en_US
dc.subject | Multimodal sensors | en_US
dc.subject | Sign language | en_US
dc.subject | Social robots | en_US
dc.title | A multimodal human-robot sign language interaction framework applied in social robots | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 17 | -
dc.identifier.doi | 10.3389/fnins.2023.1168888 | -
dcterms.abstract | Deaf-mutes face many difficulties in daily interactions with hearing people through spoken language. Sign language is an important means of expression and communication for deaf-mutes, so breaking the communication barrier between the deaf-mute and hearing communities is significant for facilitating their integration into society. To help them integrate into social life better, we propose a multimodal Chinese sign language (CSL) gesture interaction framework based on social robots. CSL gesture information, covering both static and dynamic gestures, is captured by two sensors of different modalities: a wearable Myo armband collects arm surface electromyography (sEMG) signals, and a Leap Motion sensor captures 3D hand vectors. The two modalities of gesture data are preprocessed and fused before being sent to the classifier, which improves recognition accuracy and reduces the network's processing time. Since the inputs to the proposed framework are temporal gesture sequences, a long short-term memory (LSTM) recurrent neural network is used to classify them. Comparative experiments are performed on an NAO robot to test our method. Moreover, our method effectively improves CSL gesture recognition accuracy and has potential applications in a variety of gesture interaction scenarios beyond social robots. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Frontiers in neuroscience, 2023, v. 17, 1168888 | -
dcterms.isPartOf | Frontiers in neuroscience | -
dcterms.issued | 2023 | -
dc.identifier.scopus | 2-s2.0-85153533118 | -
dc.identifier.artn | 1168888 | -
dc.description.validate | 202403 bcvc | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Startup Foundation of Chongqing Technology and Business University; German Academic Exchange Service of Germany; PolyU Start-up | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
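
The abstract above outlines the core pipeline: sEMG sequences from a Myo armband and 3D hand vectors from a Leap Motion sensor are fused, then classified with an LSTM. As a rough illustration, the following PyTorch sketch shows one plausible shape for such a fusion-plus-LSTM classifier. It is not the authors' implementation: the feature widths (8 sEMG channels, 15 Leap Motion values per frame), concatenation-based fusion, hidden size, and class count are all illustrative assumptions, since the record does not include preprocessing or network details.

import torch
import torch.nn as nn

class FusedGestureLSTM(nn.Module):
    """Minimal sketch of an early-fusion LSTM gesture classifier.

    All dimensions are assumptions for illustration; the paper's actual
    preprocessing and network configuration are not given in this record.
    """

    def __init__(self, emg_dim=8, leap_dim=15, hidden_dim=64, num_classes=10):
        super().__init__()
        # One LSTM layer over the concatenated per-frame features.
        self.lstm = nn.LSTM(emg_dim + leap_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, emg_seq, leap_seq):
        # emg_seq: (batch, T, emg_dim); leap_seq: (batch, T, leap_dim)
        fused = torch.cat([emg_seq, leap_seq], dim=-1)  # early fusion by concatenation
        _, (h_n, _) = self.lstm(fused)                  # h_n: (num_layers, batch, hidden_dim)
        return self.fc(h_n[-1])                         # logits over gesture classes

# Toy usage: a batch of 4 gesture sequences, 50 frames each.
model = FusedGestureLSTM()
emg = torch.randn(4, 50, 8)    # assumed: 8 Myo sEMG channels per frame
leap = torch.randn(4, 50, 15)  # assumed: 5 fingertip 3D vectors per frame
logits = model(emg, leap)      # shape (4, 10)

Concatenating per-frame features is only one fusion strategy; the paper may combine the modalities differently (for example, with a separate encoder per sensor), so treat this purely as a sketch of the sequence-classification idea.
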
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
fnins-17-1168888.pdf | - | 4.95 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 11 (as of Jul 7, 2024)
Downloads: 4 (as of Jul 7, 2024)
SCOPUS™ citations: 2 (as of Jul 4, 2024)
Web of Science™ citations: 2 (as of Jul 4, 2024)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.