Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/116644
DC Field | Value | Language
dc.contributor | Department of Building and Real Estate | en_US
dc.contributor | Mainland Development Office | en_US
dc.creator | Zhang, J | en_US
dc.creator | Shuai, S | en_US
dc.creator | Weng, Y | en_US
dc.creator | Hu, Y | en_US
dc.creator | Zhang, M | en_US
dc.creator | Cheng, M | en_US
dc.creator | Zhang, G | en_US
dc.date.accessioned | 2026-01-09T02:32:37Z | -
dc.date.available | 2026-01-09T02:32:37Z | -
dc.identifier.issn | 0926-5805 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/116644 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.subject | Activity understanding | en_US
dc.subject | Chain-of-thought | en_US
dc.subject | Construction monitoring | en_US
dc.subject | Inertial measurement unit (IMU) | en_US
dc.subject | Large language model (LLM) | en_US
dc.title | Large language model-driven framework for inertial measurement unit-based worker activity recognition | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 180 | en_US
dc.identifier.doi | 10.1016/j.autcon.2025.106594 | en_US
dcterms.abstract | Inertial Measurement Units (IMUs) are widely used in wearable devices to detect worker activities, but current solutions often require multiple sensors and extensive labeled training data, limiting their practicality and applicability across diverse scenarios. This paper proposes a Large Language Model (LLM)-driven framework that recognizes worker activities from a single head-mounted IMU via unsupervised reasoning. Three meta-event principles are formulated, and a video-IMU joint labeling tool is developed to extract meta-event features. An Activity Feature Recognizer is developed to identify motion characteristics, while K-Medoids clustering and autocorrelation functions are employed to quantify activity intensity and periodicity. Building upon these, an LLM-driven Agent network comprising a Supervisor, an Observer, and a Summarizer is proposed to perform reasoning and activity understanding. Experiments achieved a Hamming distance of 2.62 for meta-event activity feature recognition and a 0.931 acceptance rate for Agent-inferred activity descriptions. | en_US
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | Automation in construction, Dec. 2025, v. 180, 106594 | en_US
dcterms.isPartOf | Automation in construction | en_US
dcterms.issued | 2025-12 | -
dc.identifier.scopus | 2-s2.0-105018103490 | -
dc.identifier.eissn | 1872-7891 | en_US
dc.identifier.artn | 106594 | en_US
dc.description.validate | 202601 bchy | en_US
dc.description.oa | Not applicable | en_US
dc.identifier.SubFormID | G000669/2025-11 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | This research was jointly funded by the National Natural Science Foundation of China [Grant No. 42302322] and the Science, Technology and Innovation Commission of Shenzhen Municipality [Grant No. JCYJ20240813161904006]. | en_US
dc.description.pubStatus | Published | en_US
dc.date.embargo | 2027-12-31 | en_US
dc.description.oaCategory | Green (AAM) | en_US
Appears in Collections: Journal/Magazine Article
Open Access Information
Status: embargoed access
Embargo End Date: 2027-12-31
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.