Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/109063
DC Field: Value (Language)
dc.contributor: Department of Industrial and Systems Engineering (en_US)
dc.creator: Wang, T (en_US)
dc.creator: Fan, J (en_US)
dc.creator: Zheng, P (en_US)
dc.date.accessioned: 2024-09-19T02:56:16Z
dc.date.available: 2024-09-19T02:56:16Z
dc.identifier.issn: 1006-5911 (en_US)
dc.identifier.uri: http://hdl.handle.net/10397/109063
dc.language.iso: zh (en_US)
dc.publisher: Beijing Advanced Manufacturing Technology Consultation Center (en_US)
dc.rights: Posted with permission of the publisher. (en_US)
dc.rights: The following publication 王湉, 范峻铭, 郑湃. 基于大语言模型的人机交互移动检测机器人导航方法[J]. 计算机集成制造系统, 2024, 30(5): 1587-1594 is available at https://doi.org/10.13196/j.cims.2024.0139. (en_US)
dc.subject: Human-robot interaction (en_US)
dc.subject: Industry 5.0 (en_US)
dc.subject: Large language model (en_US)
dc.subject: Smart manufacturing (en_US)
dc.subject: Vision and language navigation (en_US)
dc.title: Large language model-based approach for human-mobile inspection robot interactive navigation (en_US)
dc.type: Journal/Magazine Article (en_US)
dc.description.otherinformation: Author name used in this publication: 王湉 (en_US)
dc.description.otherinformation: Author name used in this publication: 范峻铭 (en_US)
dc.description.otherinformation: Author name used in this publication: 郑湃 (en_US)
dc.description.otherinformation: Title in Traditional Chinese: 基於大語言模型的人機交互移動檢測機器人導航方法 (en_US)
dc.identifier.spage: 1587 (en_US)
dc.identifier.epage: 1594 (en_US)
dc.identifier.volume: 30 (en_US)
dc.identifier.issue: 5 (en_US)
dc.identifier.doi: 10.13196/j.cims.2024.0139 (en_US)
dcterms.abstract: In the manufacturing field, the wide application of mobile robots has become key to improving operational safety and efficiency. However, most existing robotic systems can only complete predefined navigation tasks and cannot adapt to unstructured environments. To overcome this bottleneck, an interactive navigation method for mobile inspection robots based on large language models was proposed, enabling the robot to replace operators in inspecting hazardous industrial areas and to execute complex navigation tasks from natural language instructions. The High-Resolution Network (HRNet) model was used for semantic scene segmentation, and the segmentation results were integrated into the reconstructed 3D scene mesh during the point cloud fusion phase to create a comprehensive 3D semantic map. A large language model then enabled the robot to comprehend human natural language instructions and to generate Python code, grounded in the 3D semantic map, to complete navigation tasks. A series of experiments in unstructured scenarios was conducted to validate the effectiveness of the proposed system. (en_US) (a minimal code sketch of this pipeline is given below the record)
dcterms.abstract: 在工业制造领域,移动机器人的广泛应用已成为提高作业安全和效率的关键.然而,现有的机器人系统只能完成预定义的导航任务,无法适应非结构化场景.为了突破这一瓶颈,提出一种基于大语言模型(LLM)的人机交互移动检测机器人导航方法,可代替操作人员进入工业环境中的危险区域进行检测,并且可以根据人类自然语言指令完成复杂的导航任务.首先,通过高分辨率网络(HRNet)模型进行场景语义分割,并在点云融合阶段将语义分割结果渲染到重建的三维场景网格模型中,得到三维语义地图;然后利用大语言模型让机器人可以理解人类的自然语言指令,并根据创建的三维语义地图生成Python代码控制机器人完成导航任务.最后,通过一系列非结构化场景下的实验验证了该系统的有效性. (en_US)
dcterms.accessRights: open access (en_US)
dcterms.alternative: 基于大语言模型的人机交互移动检测机器人导航方法 (en_US)
dcterms.bibliographicCitation: 計算機集成製造系統-CIMS (Computer integrated manufacturing systems), 2024, v. 30, no. 5, p. 1587-1594 (en_US)
dcterms.isPartOf: 計算機集成製造系統-CIMS (Computer integrated manufacturing systems) (en_US)
dcterms.issued: 2024
dc.description.validate: 202409 bcch (en_US)
dc.description.oa: Version of Record (en_US)
dc.identifier.FolderNumber: CDCF_2023-2024
dc.description.fundingSource: RGC (en_US)
dc.description.pubStatus: Published (en_US)
dc.description.oaCategory: Publisher permission (en_US)
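
The abstract describes a two-stage pipeline: HRNet segmentation labels are fused into the reconstructed 3D scene to form a semantic map, and a large language model turns a natural language instruction into Python code that drives the robot over that map. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the names SemanticMap3D, InspectionRobot, instruction_to_code and the fake_llm stand-in, as well as the prompt template, are assumptions introduced only for illustration.

    # Hypothetical sketch of the pipeline described in the abstract: an
    # HRNet-style segmenter labels the scene, the labels are attached to the
    # reconstructed 3D map, and an LLM converts an instruction into Python
    # code that calls a simple navigation interface.
    from dataclasses import dataclass, field
    from typing import Dict, List, Tuple


    @dataclass
    class SemanticMap3D:
        """3D semantic map: label -> list of region centroids (x, y, z) in metres."""
        regions: Dict[str, List[Tuple[float, float, float]]] = field(default_factory=dict)

        def add_labelled_point(self, label: str, xyz: Tuple[float, float, float]) -> None:
            # In the paper the label would come from HRNet segmentation rendered
            # onto the scene mesh during point cloud fusion; here we simply store
            # one centroid per labelled region.
            self.regions.setdefault(label, []).append(xyz)

        def locate(self, label: str) -> Tuple[float, float, float]:
            # Return the first known instance of the labelled region.
            return self.regions[label][0]


    class InspectionRobot:
        """Placeholder navigation interface exposed to the LLM-generated code."""

        def navigate_to(self, xyz: Tuple[float, float, float]) -> None:
            print(f"navigating to {xyz}")

        def inspect(self, label: str) -> None:
            print(f"inspecting {label}")


    PROMPT_TEMPLATE = """You control a mobile inspection robot.
    Known regions and coordinates: {regions}
    Only call robot.navigate_to((x, y, z)) and robot.inspect(label).
    Write Python code that fulfils this instruction: {instruction}
    """


    def instruction_to_code(instruction: str, semantic_map: SemanticMap3D, llm) -> str:
        # `llm` is any callable mapping a prompt string to generated text,
        # e.g. a thin wrapper around a hosted large language model.
        prompt = PROMPT_TEMPLATE.format(regions=semantic_map.regions,
                                        instruction=instruction)
        return llm(prompt)


    if __name__ == "__main__":
        smap = SemanticMap3D()
        smap.add_labelled_point("electrical cabinet", (4.2, 1.5, 0.0))
        smap.add_labelled_point("gas valve", (7.8, -2.1, 0.0))
        robot = InspectionRobot()

        # Stand-in for a real LLM call: returns code such a model might produce.
        def fake_llm(prompt: str) -> str:
            return ("robot.navigate_to(smap.locate('gas valve'))\n"
                    "robot.inspect('gas valve')\n")

        code = instruction_to_code("Check the gas valve for leaks.", smap, fake_llm)
        # Executing model-generated code should be sandboxed in a real deployment.
        exec(code, {"robot": robot, "smap": smap})

In a real deployment the fake_llm callable would be replaced by an actual large language model query and the InspectionRobot methods would wrap the robot's navigation stack; the exec call is shown only to make the end-to-end flow concrete.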
Appears in Collections: Journal/Magazine Article
Files in This Item:
File: Wang_Large_Language_Model-based.pdf, 511.39 kB, Adobe PDF (View/Open)
Open Access Information
Status: open access
File Version: Version of Record
Access: View full-text via PolyU eLinks SFX Query

Page views: 174 (as of Nov 10, 2025)
Downloads: 1,191 (as of Nov 10, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.