Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/113783
DC Field | Value | Language
dc.contributor | Department of Mechanical Engineering | -
dc.contributor | Department of Industrial and Systems Engineering | -
dc.creator | Huo, S | -
dc.creator | Duan, A | -
dc.creator | Li, C | -
dc.creator | Zhou, P | -
dc.creator | Ma, W | -
dc.creator | Wang, H | -
dc.creator | Navarro-Alarcon, D | -
dc.date.accessioned | 2025-06-24T06:37:46Z | -
dc.date.available | 2025-06-24T06:37:46Z | -
dc.identifier.uri | http://hdl.handle.net/10397/113783 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | en_US
dc.rights | © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication S. Huo et al., "Keypoint-Based Planar Bimanual Shaping of Deformable Linear Objects Under Environmental Constraints With Hierarchical Action Framework," in IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 5222-5229, April 2022 is available at https://doi.org/10.1109/LRA.2022.3154842. | en_US
dc.subject | Action planning | en_US
dc.subject | Deformable linear objects | en_US
dc.subject | Hierarchical framework | en_US
dc.subject | Robot manipulation | en_US
dc.subject | Synthetic learning | en_US
dc.title | Keypoint-based planar bimanual shaping of deformable linear objects under environmental constraints with hierarchical action framework | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 5222 | -
dc.identifier.epage | 5229 | -
dc.identifier.volume | 7 | -
dc.identifier.issue | 2 | -
dc.identifier.doi | 10.1109/LRA.2022.3154842 | -
dcterms.abstract | This letter addresses the problem of contact-based manipulation of deformable linear objects (DLOs) towards desired shapes with a dual-arm robotic system. To alleviate the burden of high-dimensional continuous state-action spaces, we model DLOs as kinematic multibody systems via our proposed keypoint encoding network. This novel encoding is trained on a synthetic labeled image dataset without requiring any manual annotations and can be directly transferred to real manipulation scenarios. Our goal-conditioned policy efficiently rearranges the configuration of the DLO based on the keypoints. The proposed hierarchical action framework tackles the manipulation problem in a coarse-to-fine manner (with high-level task planning and low-level motion control) by leveraging two action primitives. The identification of deformation properties is bypassed since the algorithm replans its motion after each bimanual execution. Experimental results reveal that our method achieves high performance in state representation and shaping manipulation of the DLO under environmental constraints. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE robotics and automation letters, Apr. 2022, v. 7, no. 2, p. 5222-5229 | -
dcterms.isPartOf | IEEE robotics and automation letters | -
dcterms.issued | 2022-04 | -
dc.identifier.scopus | 2-s2.0-85126301392 | -
dc.identifier.eissn | 2377-3766 | -
dc.description.validate | 202506 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a3769a | en_US
dc.identifier.SubFormID | 50983 | en_US
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
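The coarse-to-fine replanning scheme described in the abstract (keypoints as the DLO state, high-level action selection, a low-level motion primitive, and a replan after every bimanual execution) can be sketched in Python. This is a toy illustration only: every function name and the simplistic "primitive" below are hypothetical stand-ins, not the authors' implementation, and the keypoint encoder is replaced by an identity mapping on an already-extracted keypoint array.

```python
import numpy as np

# Hypothetical sketch of the replan-after-each-execution loop. The point it
# illustrates: deformation properties are never identified, because the
# planner re-observes the keypoint state after every bimanual action.

def encode_keypoints(dlo_state):
    """Stand-in for the keypoint encoding network: here the state is
    already a (K, 2) array of planar keypoints, so this is the identity."""
    return np.asarray(dlo_state, dtype=float)

def plan_action(keypoints, goal):
    """High-level planning stand-in: pick the keypoint farthest from its
    goal position as the next one to manipulate."""
    errors = np.linalg.norm(keypoints - goal, axis=1)
    return int(np.argmax(errors))

def execute_primitive(keypoints, goal, idx, step=0.5):
    """Low-level motion-control stand-in: move the selected keypoint toward
    its goal, dragging its neighbours along (a crude deformation model)."""
    kp = keypoints.copy()
    kp[idx] += step * (goal[idx] - kp[idx])
    for j in (idx - 1, idx + 1):  # neighbouring keypoints follow weakly
        if 0 <= j < len(kp):
            kp[j] += 0.25 * step * (goal[j] - kp[j])
    return kp

def shape_dlo(initial, goal, tol=1e-2, max_iters=200):
    """Replan after each execution until all keypoints reach the goal."""
    kp = encode_keypoints(initial)
    goal = np.asarray(goal, dtype=float)
    for _ in range(max_iters):
        if np.linalg.norm(kp - goal, axis=1).max() < tol:
            break
        idx = plan_action(kp, goal)
        kp = execute_primitive(kp, goal, idx)
    return kp
```

In the paper this loop is closed through perception and two learned action primitives; here the geometric update merely mimics the structure of the feedback loop, not its physics.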
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Huo_Keypoint_Based_Planar.pdf | Pre-Published version | 2.47 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 45 (as of Feb 9, 2026)
Downloads: 81 (as of Feb 9, 2026)
SCOPUS™ citations: 36 (as of May 8, 2026)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.