Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/101462
| Title: | Wearable acceleration-based action recognition for long-term and continuous activity analysis in construction site |
| Authors: | Gong, Y; Yang, K; Seo, J; Lee, JG |
| Issue Date: | Jul-2022 |
| Source: | Journal of Building Engineering, 15 July 2022, v. 52, 104448 |
| Abstract: | As construction is labor intensive, improving labor productivity is essential for achieving better project performance. Activity analysis, a widely adopted approach to improving labor productivity, measures the time spent on specific activities and can identify the root causes of low productivity. Automated action recognition, which applies machine learning-based classification to data (e.g., accelerations) collected from wearable sensors, has been introduced as an effective means of monitoring and measuring activities and addresses the limitations of observation-based activity analysis. Despite the potential of acceleration-based action recognition, some challenges still need to be addressed from a practical perspective. For example, action categories defined in previous studies tend to be based on either body movements (e.g., walking, lifting, sitting, and standing) or work contexts (e.g., spreading mortar and laying a concrete block), thereby hindering a comprehensive understanding of the diverse nature of construction activities. In addition, the approach needs to be further tested with noisy, continuous acceleration data collected from construction sites to validate its applicability and practicality in actual use. To address these issues, this research proposes a comprehensive hierarchical activity taxonomy (Levels 1 to 3) for acceleration-based action recognition that explicitly categorizes diverse construction activities according to both body movements and work contexts. The proposed taxonomy was tested using acceleration data collected from 18 construction workers, including formwork and rebar workers, at two construction sites in Hong Kong. Different machine-learning algorithms were implemented for the hierarchically defined construction activities. Testing results indicate competitive classification performance on Level 1 activities, with 98% accuracy in identifying work and idling. Level 2 classification accuracy is also acceptable: 90.6% for formwork and 86.6% for rebar work. Level 3 classification, which reaches an accuracy of 77.1% (formwork) and 74.9% (rebar work), requires further improvement before it can be applied in the construction field. The results of this study provide practical insights into the application of acceleration-based automated activity analysis for productivity monitoring. |
| Keywords: | Accelerometer; Action recognition; Activity taxonomy; Automation; Productivity; Wearable sensor |
| Publisher: | Elsevier Ltd |
| Journal: | Journal of Building Engineering |
| EISSN: | 2352-7102 |
| DOI: | 10.1016/j.jobe.2022.104448 |
| Rights: | © 2022 Elsevier Ltd. All rights reserved. This manuscript version is made available under the CC BY-NC-ND 4.0 license (https://creativecommons.org/licenses/by-nc-nd/4.0/). The following publication, Gong, Y., et al. (2022), "Wearable acceleration-based action recognition for long-term and continuous activity analysis in construction site," Journal of Building Engineering 52: 104448, is available at https://doi.org/10.1016/j.jobe.2022.104448. |
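The classification pipeline summarized in the abstract (windowed wearable acceleration signals, feature extraction, and a separate classifier per taxonomy level) can be illustrated with a minimal sketch. This is not the authors' implementation: the 2 s windows at 50 Hz, the simple statistical features, the random-forest classifiers, and the synthetic labels are all assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(windows):
    """Summarize each tri-axial acceleration window with basic statistics.

    windows: array of shape (n_windows, window_len, 3).
    Returns an array of shape (n_windows, 12).
    """
    return np.hstack([
        windows.mean(axis=1),
        windows.std(axis=1),
        windows.min(axis=1),
        windows.max(axis=1),
    ])

# Hypothetical data: 200 windows of 2 s at 50 Hz (window_len = 100 samples).
rng = np.random.default_rng(0)
raw = rng.normal(size=(200, 100, 3))
y_level1 = rng.integers(0, 2, size=200)   # Level 1: 0 = idling, 1 = work
y_level2 = rng.integers(0, 4, size=200)   # Level 2: finer work-context classes

X = extract_features(raw)

# One classifier per taxonomy level. Here, the Level 2 model is trained
# only on windows the Level 1 model labels as "work", mimicking a cascade.
clf_l1 = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y_level1)
work = clf_l1.predict(X) == 1
clf_l2 = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[work], y_level2[work])

print("Level 1 predictions:", clf_l1.predict(X[:5]))
print("Level 2 predictions on 'work' windows:", clf_l2.predict(X[work][:5]))
```

A cascaded design like this, where finer levels only refine windows recognized as work at the level above, is one plausible reading of the hierarchical taxonomy; consult the paper itself for the actual feature set, window length, and training setup.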
| Appears in Collections: | Journal/Magazine Article |
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Gong_Wearable_Acceleration-based_Action.pdf | Pre-Published version | 1.55 MB | Adobe PDF |
Page views: 140 (as of Nov 10, 2025)
Downloads: 109 (as of Nov 10, 2025)
SCOPUS™ citations: 25 (as of Dec 19, 2025)
Web of Science™ citations: 17 (as of May 15, 2025)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.