Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115562
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | -
dc.creator | Zheng, Y | -
dc.creator | Yao, L | -
dc.creator | Su, Y | -
dc.creator | Zhang, Y | -
dc.creator | Wang, Y | -
dc.creator | Zhao, S | -
dc.creator | Zhang, Y | -
dc.creator | Chau, LP | -
dc.date.accessioned | 2025-10-08T01:16:26Z | -
dc.date.available | 2025-10-08T01:16:26Z | -
dc.identifier.issn | 2731-538X | -
dc.identifier.uri | http://hdl.handle.net/10397/115562 | -
dc.language.iso | en | en_US
dc.publisher | Science in China Press | en_US
dc.rights | © The Author(s) 2025 | en_US
dc.rights | Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. | en_US
dc.rights | The following publication Zheng, Y., Yao, L., Su, Y. et al. A Survey of Embodied Learning for Object-centric Robotic Manipulation. Mach. Intell. Res. 22, 588–626 (2025) is available at https://doi.org/10.1007/s11633-025-1542-8. | en_US
dc.subject | Affordance learning | en_US
dc.subject | Embodied learning | en_US
dc.subject | Policy learning | en_US
dc.subject | Pose estimation | en_US
dc.subject | Robotic manipulation | en_US
dc.title | A survey of embodied learning for object-centric robotic manipulation | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 588 | -
dc.identifier.epage | 626 | -
dc.identifier.volume | 22 | -
dc.identifier.issue | 4 | -
dc.identifier.doi | 10.1007/s11633-025-1542-8 | -
dcterms.abstract | Embodied learning for object-centric robotic manipulation is a rapidly developing and challenging area in embodied AI. It is crucial for advancing next-generation intelligent robots and has garnered significant interest recently. Unlike data-driven machine learning methods, embodied learning focuses on robot learning through physical interaction with the environment and perceptual feedback, making it especially suitable for robotic manipulation. In this paper, we provide a comprehensive survey of the latest advancements in this field and categorize the existing work into three main branches: 1) Embodied perceptual learning, which aims to predict object pose and affordance through various data representations; 2) Embodied policy learning, which focuses on generating optimal robotic decisions using methods such as reinforcement learning and imitation learning; 3) Embodied task-oriented learning, designed to optimize the robot’s performance based on the characteristics of different tasks in object grasping and manipulation. In addition, we offer an overview and discussion of public datasets, evaluation metrics, representative applications, current challenges, and potential future research directions. A project associated with this survey has been established at https://github.com/RayYoh/OCRM_survey. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Machine intelligence research, Aug. 2025, v. 22, no. 4, p. 588-626 | -
dcterms.isPartOf | Machine intelligence research | -
dcterms.issued | 2025-08 | -
dc.identifier.scopus | 2-s2.0-105008984355 | -
dc.identifier.eissn | 2731-5398 | -
dc.description.validate | 202510 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_TA | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | The research work was partly conducted in the JC STEM Lab of Machine Learning and Computer Vision funded by The Hong Kong Jockey Club Charities Trust. This work was supported in part by the National Natural Science Foundation of China (No. 62106236). | en_US
dc.description.pubStatus | Published | en_US
dc.description.TA | Springer Nature (2025) | en_US
dc.description.oaCategory | TA | en_US
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
s11633-025-1542-8.pdf | - | 3.25 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Access: View full-text via PolyU eLinks
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.