Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/118029
DC Field | Value | Language
dc.contributor | Department of Mechanical Engineering | -
dc.creator | Lee, HY | -
dc.creator | Zhou, P | -
dc.creator | Duan, A | -
dc.creator | Ma, W | -
dc.creator | Yang, C | -
dc.creator | Navarro-Alarcon, D | -
dc.date.accessioned | 2026-03-12T01:03:03Z | -
dc.date.available | 2026-03-12T01:03:03Z | -
dc.identifier.issn | 0736-5845 | -
dc.identifier.uri | http://hdl.handle.net/10397/118029 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier Ltd | en_US
dc.rights | © 2026 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/). | en_US
dc.rights | The following publication Lee, H.-Y., Zhou, P., Duan, A., Ma, W., Yang, C., & Navarro-Alarcon, D. (2026). Non-prehensile tool-object manipulation by integrating LLM-based planning and manoeuvrability-driven controls. Robotics and Computer-Integrated Manufacturing, 100, 103231 is available at https://doi.org/10.1016/j.rcim.2026.103231. | en_US
dc.subject | Human–robot collaboration | en_US
dc.subject | Large Language Models (LLMs) | en_US
dc.subject | Symbolic planning | en_US
dc.title | Non-prehensile tool-object manipulation by integrating LLM-based planning and manoeuvrability-driven controls | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 100 | -
dc.identifier.doi | 10.1016/j.rcim.2026.103231 | -
dcterms.abstract | The ability to wield tools was once considered exclusive to human intelligence, but it is now known that many other animals, like crows, possess this capability. Yet, robotic systems still fall short of matching biological dexterity. In this paper, we investigate the use of Large Language Models (LLMs), tool affordances, and object manoeuvrability for non-prehensile tool-based manipulation tasks. Our novel method leverages LLMs based on scene information and natural language instructions to enable symbolic task planning for tool-object manipulation. This approach allows the system to convert a human language sentence into a sequence of feasible motion functions. We have developed a novel manoeuvrability-driven controller using a new tool affordance model derived from visual feedback. This controller helps guide the robot's tool utilization and manipulation actions, even within confined areas, using a stepping incremental approach. The proposed methodology is evaluated with experiments to prove its effectiveness under various manipulation scenarios. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Robotics and computer-integrated manufacturing, Aug. 2026, v. 100, 103231 | -
dcterms.isPartOf | Robotics and computer-integrated manufacturing | -
dcterms.issued | 2026-08 | -
dc.identifier.scopus | 2-s2.0-105027635194 | -
dc.identifier.eissn | 1879-2537 | -
dc.identifier.artn | 103231 | -
dc.description.validate | 202603 bcch | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_TA | en_US
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | This work is supported in part by the Research Grants Council of Hong Kong under grant C4042-23GF, and in part by the National Natural Science Foundation of China (NSFC) under Grant No. 62403211. | en_US
dc.description.pubStatus | Published | en_US
dc.description.TA | Elsevier (2026) | en_US
dc.description.oaCategory | TA | en_US
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
1-s2.0-S0736584526000116-main.pdf | | 3.5 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.