Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/116139
Title: LLM-assisted VR-enabled intuitive robotic control systematic approach for human-centric smart manufacturing
Authors: Wan, Ke
Degree: M.Phil.
Issue Date: 2025
Abstract: With the shift towards personalized manufacturing and the growing demand for high customization following the introduction of the Industry 5.0 concept, traditional predefined programming approaches have become inadequate for meeting the increasingly complex demands of modern product manufacturing processes. Preprogrammed robots, which rely on rigid and fixed action sequences, face significant challenges in complex and dynamic production environments. While these robots are designed to perform repetitive tasks with precision, they lack the flexibility to adapt to changing conditions or unexpected variations in the production process. Consequently, any deviation from the programmed task necessitates manual reprogramming, resulting in limited effectiveness and increased operational costs. To address these limitations, human-centric manufacturing has emerged as a solution, enabling seamless integration of human intelligence with the precision and efficiency of robots. Unlike traditional preprogrammed robots, human-centric manufacturing systems are highly adaptable, responding dynamically to the variability and unpredictability inherent in customized manufacturing environments.
In recent years, significant research has focused on human-centric manufacturing, with an emphasis on enhancing interaction efficiency and integrating artificial intelligence (AI) for decision-making. These studies have investigated how collaborative robots can serve as more efficient interaction platforms and adapt to dynamic environments. However, despite these advancements, several notable research gaps persist. For example, while AI has been incorporated into certain planning processes, the potential of large language models (LLMs) for comprehensive robot task planning remains underexplored. Furthermore, existing research often falls short in developing intuitive robot control systems, leading to high learning costs, diminished user experience, and reduced precision for human operators. To address these challenges, this thesis proposes solutions focusing on three critical aspects of human-centric manufacturing scenarios: perception, planning, and execution.
In the investigation of robot task planning, a multi-modal pre-trained LLM is leveraged to translate high-level human instructions seamlessly into actionable robot commands, enhancing the interaction between human operators and robotic systems (Chapter 3). This approach is structured across three integral layers: a task decomposer, a motion descriptor, and a robot code generator. Each layer is designed around structured prompts, including detailed templates and specific rules, so that the LLM agent generates precise and effective outputs.
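To make the layered structure concrete, the following minimal Python sketch shows how such a three-layer pipeline could be wired together. The prompt wording, all function names, and the stubbed complete() call are illustrative assumptions, not code or prompts taken from the thesis.

    # Hedged sketch of the three-layer planning pipeline (task decomposer ->
    # motion descriptor -> robot code generator). All prompts and names are
    # hypothetical; replace complete() with a real multi-modal LLM client.

    TASK_DECOMPOSER = (
        "You are a task decomposer for a robot manipulator.\n"
        "Rules: output one sub-task per line, using only the verbs "
        "pick, move, place.\n"
        "Instruction: {instruction}"
    )
    MOTION_DESCRIPTOR = (
        "Describe the motion for this sub-task as a start pose, a goal pose, "
        "and a gripper state.\n"
        "Sub-task: {subtask}"
    )
    CODE_GENERATOR = (
        "Translate the motion description into robot API calls, one per line, "
        "e.g. move_to(x, y, z); close_gripper().\n"
        "Motion: {motion}"
    )

    def complete(prompt: str) -> str:
        """Stand-in for a pre-trained LLM call; returns a placeholder here."""
        return "<LLM output for: " + prompt.splitlines()[-1] + ">"

    def plan(instruction: str) -> list[str]:
        """Pass a human instruction through all three layers in sequence."""
        subtasks = complete(
            TASK_DECOMPOSER.format(instruction=instruction)
        ).splitlines()
        programs = []
        for subtask in subtasks:
            motion = complete(MOTION_DESCRIPTOR.format(subtask=subtask))
            programs.append(complete(CODE_GENERATOR.format(motion=motion)))
        return programs

    print(plan("Place the machined part into the inspection tray"))

Each layer consumes the previous layer's output as part of its own prompt, which is what lets the rules and templates at every stage constrain what the LLM can emit next.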
In addition, a VR-based robotic control system is developed to provide intuitive control and immersive visual feedback for robotic manipulation in teleoperation tasks (Chapter 4). The system leverages immersive VR interfaces to give operators real-time feedback on, and control over, robot actions, enabling seamless interaction in complex environments. By integrating intuitive VR input methods, immersive visual perception approaches, and seamless data exchange between the human operator and the onsite robot manipulator, the teleoperation system enables operators to manipulate objects in complex manufacturing scenarios and facilitates efficient task execution.
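The data-exchange link between the VR interface and the onsite manipulator can be pictured as a stream of pose and gripper commands. The sketch below, with its JSON-over-TCP message layout and field names, is an illustrative assumption rather than the protocol actually used in the thesis.

    # Hypothetical operator-to-robot message exchange for the VR teleoperation
    # loop: each VR controller sample is serialized as newline-delimited JSON.
    import json
    import socket

    def send_vr_sample(sock: socket.socket,
                       position: tuple[float, float, float],
                       orientation: tuple[float, float, float, float],
                       gripper_closed: bool) -> None:
        """Stream one VR controller pose to the robot-side controller."""
        msg = {
            "position": list(position),        # metres, in the robot base frame
            "orientation": list(orientation),  # quaternion (x, y, z, w)
            "gripper_closed": gripper_closed,
        }
        sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))

A matching receiver on the robot side would read the stream line by line and map each pose onto manipulator joint targets via inverse kinematics; a production system would more likely use an established middleware such as ROS topics with time-stamped messages rather than a raw socket.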
Pages: xi, 105 pages : color illustrations
Appears in Collections: Thesis
