Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/92596
DC Field: Value (Language)
dc.contributor: Department of Industrial and Systems Engineering (en_US)
dc.creator: Li, S (en_US)
dc.creator: Zheng, P (en_US)
dc.creator: Fan, J (en_US)
dc.creator: Wang, L (en_US)
dc.date.accessioned: 2022-04-26T06:45:44Z
dc.date.available: 2022-04-26T06:45:44Z
dc.identifier.issn: 0278-0046 (en_US)
dc.identifier.uri: http://hdl.handle.net/10397/92596
dc.language.iso: en (en_US)
dc.publisher: Institute of Electrical and Electronics Engineers (en_US)
dc.rights: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. (en_US)
dc.rights: The following publication S. Li, P. Zheng, J. Fan and L. Wang, "Toward Proactive Human–Robot Collaborative Assembly: A Multimodal Transfer-Learning-Enabled Action Prediction Approach," in IEEE Transactions on Industrial Electronics, vol. 69, no. 8, pp. 8579-8588, Aug. 2022 is available at https://dx.doi.org/10.1109/TIE.2021.3105977. (en_US)
dc.subject: Action recognition (en_US)
dc.subject: Human-robot collaboration (en_US)
dc.subject: Multimodal intelligence (en_US)
dc.subject: Transfer learning (en_US)
dc.title: Toward proactive human-robot collaborative assembly: a multimodal transfer-learning-enabled action prediction approach (en_US)
dc.type: Journal/Magazine Article (en_US)
dc.identifier.spage: 8579 (en_US)
dc.identifier.epage: 8588 (en_US)
dc.identifier.volume: 69 (en_US)
dc.identifier.issue: 8 (en_US)
dc.identifier.doi: 10.1109/TIE.2021.3105977 (en_US)
dcterms.abstract: Human-robot collaborative assembly (HRCA) is vital for achieving the high-level flexible automation needed for mass personalization in today's smart factories. However, existing works in both industry and academia mainly focus on adaptive robot planning and seldom consider the human operator's intentions in advance, which hinders the transition of HRCA toward a proactive manner. To overcome this bottleneck, this article proposes a multimodal transfer-learning-enabled action prediction approach, serving as the prerequisite for proactive HRCA. First, a multimodal intelligence-based action recognition approach is proposed to predict ongoing human actions by leveraging a visual stream and a skeleton stream with short-time input frames. Second, a transfer-learning-enabled model is adapted to rapidly transfer knowledge learnt from daily activities to industrial assembly operations for online operator intention analysis. Third, a dynamic decision-making mechanism, including robotic decision and motion control, is described to allow mobile robots to assist operators proactively. Finally, an aircraft bracket assembly task is demonstrated in a laboratory environment, and the comparative study shows that the proposed approach outperforms other state-of-the-art methods in efficient action prediction. (en_US)
dcterms.accessRights: open access (en_US)
dcterms.bibliographicCitation: IEEE transactions on industrial electronics, Aug. 2022, v. 69, no. 8, p. 8579-8588 (en_US)
dcterms.isPartOf: IEEE transactions on industrial electronics (en_US)
dcterms.issued: 2022-08
dc.identifier.scopus: 2-s2.0-85114652119
dc.identifier.eissn: 1557-9948 (en_US)
dc.description.validate: 202204 bcch (en_US)
dc.description.oa: Accepted Manuscript (en_US)
dc.identifier.FolderNumber: a1288, ISE-0205
dc.identifier.SubFormID: 44469
dc.description.fundingSource: RGC (en_US)
dc.description.fundingSource: Others (en_US)
dc.description.fundingText: Others: Innovation and Technology Commission (en_US)
dc.description.pubStatus: Published (en_US)
dc.identifier.OPUS: 56041428
dc.description.oaCategory: Green (AAM) (en_US)
Appears in Collections: Journal/Magazine Article
Files in This Item:
File: Li_Towards_Proactive_Human.pdf (Pre-Published version, 3.98 MB, Adobe PDF)
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 89 (as of May 19, 2024; last week: 0)
Downloads: 471 (as of May 19, 2024)
Scopus citations: 54 (as of May 17, 2024)
Web of Science citations: 48 (as of May 2, 2024)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.