Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/101462
DC Field | Value | Language
dc.contributor | Department of Building and Real Estate | en_US
dc.creator | Gong, Y | en_US
dc.creator | Yang, K | en_US
dc.creator | Seo, J | en_US
dc.creator | Lee, JG | en_US
dc.date.accessioned | 2023-09-18T02:28:09Z | -
dc.date.available | 2023-09-18T02:28:09Z | -
dc.identifier.uri | http://hdl.handle.net/10397/101462 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier Ltd | en_US
dc.rights | © 2022 Elsevier Ltd. All rights reserved. | en_US
dc.rights | © 2022. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/ | en_US
dc.rights | The following publication Gong, Y., et al. (2022). "Wearable acceleration-based action recognition for long-term and continuous activity analysis in construction site." Journal of Building Engineering 52: 104448 is available at https://doi.org/10.1016/j.jobe.2022.104448. | en_US
dc.subject | Accelerometer | en_US
dc.subject | Action recognition | en_US
dc.subject | Activity taxonomy | en_US
dc.subject | Automation | en_US
dc.subject | Productivity | en_US
dc.subject | Wearable sensor | en_US
dc.title | Wearable acceleration-based action recognition for long-term and continuous activity analysis in construction site | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 52 | en_US
dc.identifier.doi | 10.1016/j.jobe.2022.104448 | en_US
dcterms.abstract | As construction is labor intensive, improving labor productivity is essential for achieving better project performance. Activity analysis, a widely adopted approach to improving labor productivity, measures the time spent on specific activities and can identify the root causes of low productivity. Automated action recognition, which applies machine learning-based classification to data (e.g., accelerations) collected from wearable sensors, addresses the limitations of observation-based activity analysis and has been introduced as an effective means of monitoring and measuring activities. Despite its potential, acceleration-based action recognition still faces practical challenges. For example, action categories defined in previous studies tend to be based on either body movements (e.g., walking, lifting, sitting, and standing) or work contexts (e.g., spreading mortar and laying a concrete block), hindering a comprehensive understanding of the diverse nature of construction activities. The approach also needs to be further tested with noisy, continuous acceleration data collected from construction sites to validate its applicability and practicality in actual use. To address these issues, this research proposes a comprehensive hierarchical activity taxonomy (Levels 1 to 3) for acceleration-based action recognition that explicitly categorizes diverse construction activities according to both body movements and work contexts. The proposed taxonomy was tested using acceleration data collected from 18 construction workers, including formwork and rebar workers, at two construction sites in Hong Kong. Different machine-learning algorithms were implemented on the basis of the hierarchically defined construction activities. Testing results indicate competitive classification performance on Level 1 activities, with 98% accuracy in identifying work and idling. The prediction accuracy of Level 2 classification is also acceptable, at 90.6% for formwork and 86.6% for rebar work. Level 3 classification, which reaches accuracies of 77.1% (formwork) and 74.9% (rebar work), requires further improvement before it can be applied in the construction field. The results of this study provide practical insights into the application of acceleration-based automated activity analysis for productivity monitoring. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Journal of building engineering, 15 July 2022, v. 52, 104448 | en_US
dcterms.isPartOf | Journal of building engineering | en_US
dcterms.issued | 2022-07 | -
dc.identifier.scopus | 2-s2.0-85128237299 | -
dc.identifier.eissn | 2352-7102 | en_US
dc.identifier.artn | 104448 | en_US
dc.description.validate | 202309 bcch | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a2403 | -
dc.identifier.SubFormID | 47619 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
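The abstract describes a hierarchical scheme: a Level 1 classifier separates work from idling, and finer-grained classifiers refine the windows predicted as work. A minimal illustrative sketch of that idea follows. This is not the authors' implementation: the features, labels ("action_a", "action_b"), and random-forest models are assumptions chosen for illustration, and the data are synthetic.

```python
# Illustrative sketch of hierarchical action recognition on accelerometer
# windows (synthetic data; not the paper's actual pipeline or features).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_windows(n, mean):
    # Each window is summarized by 6 statistical features (e.g., mean/std
    # per axis), a common representation for acceleration signals.
    return rng.normal(mean, 1.0, size=(n, 6))

# Synthetic windows: idling plus two hypothetical work actions.
X = np.vstack([make_windows(100, 0.0),   # idling
               make_windows(100, 2.0),   # action_a
               make_windows(100, 4.0)])  # action_b
y1 = np.array(["idling"] * 100 + ["work"] * 200)                      # Level 1
y2 = np.array(["-"] * 100 + ["action_a"] * 100 + ["action_b"] * 100)  # Level 2

# Level 1 model: work vs idling, trained on all windows.
clf1 = RandomForestClassifier(random_state=0).fit(X, y1)

# Level 2 model: trained only on windows labeled "work".
work = y1 == "work"
clf2 = RandomForestClassifier(random_state=0).fit(X[work], y2[work])

def predict_hierarchical(Xq):
    """Predict Level 1 for every window; refine to Level 2 only for 'work'."""
    lvl1 = clf1.predict(Xq)
    lvl2 = np.where(lvl1 == "work", clf2.predict(Xq), "-")
    return lvl1, lvl2
```

The cascade mirrors the taxonomy's structure: errors at Level 1 propagate downward, which is one reason reported accuracy in the abstract decreases from Level 1 (98%) to Level 3 (74.9-77.1%).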
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Gong_Wearable_Acceleration-based_Action.pdf | Pre-Published version | 1.55 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 140 (as of Nov 10, 2025)
Downloads: 109 (as of Nov 10, 2025)
Scopus™ citations: 25 (as of Dec 19, 2025)
Web of Science™ citations: 17 (as of May 15, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.