Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/88013
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Building and Real Estate | - |
dc.contributor | Department of Civil and Environmental Engineering | - |
dc.creator | Chen, L | - |
dc.creator | Lin, KJ | - |
dc.creator | Siu, MFF | - |
dc.creator | Wang, Y | - |
dc.creator | Chan, PCA | - |
dc.creator | Lau, CFD | - |
dc.date.accessioned | 2020-09-09T00:54:46Z | - |
dc.date.available | 2020-09-09T00:54:46Z | - |
dc.identifier.isbn | 978-962-367-821-6 | - |
dc.identifier.uri | http://hdl.handle.net/10397/88013 | - |
dc.description | Also in "Proceedings of the CIB World Building Congress 2019 : Constructing Smart Cities", the Hong Kong Polytechnic University, Hong Kong, 17-21 June, 2019, p. [1467-1477] (online version) | en_US |
dc.language.iso | en | en_US |
dc.rights | Posted with permission. | en_US |
dc.title | Classification of construction trade and quantification of work efficiency using posture recognitions and deep neural networks | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 1270 | - |
dc.identifier.epage | 1281 | - |
dcterms.abstract | In construction, planners must estimate the productivity and work efficiency of a limited workforce on site to deliver projects on time. Time study is traditionally used to analyse workers' performance and derive productivity and efficiency, but this manual method is tedious and time-consuming. In Hong Kong, video cameras are routinely installed on construction sites to record site operations and workers' workflows. Site videos are archived and provide solid evidence in the event of construction site accidents. These videos capture large volumes of data that could be put to better use for benchmarking worker productivity and work efficiency. Although research efforts have advanced object recognition technology to automate productivity analysis on modern computers, its application to productivity analysis remains highly limited. In this research, we propose a novel approach to video-based productivity analysis that combines posture recognition with deep neural networks. First, a prediction model based on a deep neural network and transfer learning is calibrated. Next, a knowledge base of posture information for specialty trades is developed by capturing the actual postures of workers of known trades. Then, driven by this knowledge base, workers of unknown trades are automatically classified using posture recognition (rather than face recognition) and deep neural networks, so that the work efficiency of particular trades can be determined. To demonstrate the proposed method, a practical case study of a housing project in Hong Kong is used; the scope of the study is limited to one trade, steel bar bender and fixer. The computational effectiveness of the new approach is reported. In conclusion, the new approach outperforms the traditional one in terms of result reliability and time efficiency. | - |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Proceedings of the CIB World Building Congress 2019 : Constructing Smart Cities, the Hong Kong Polytechnic University, Hong Kong, 17-21 June, 2019, p. [1270-1281] (online version) | - |
dcterms.issued | 2019 | - |
dc.relation.conference | CIB World Building Congress | - |
dc.description.validate | 202009 bcrc | - |
dc.description.oa | Version of Record | en_US |
dc.identifier.FolderNumber | OA_Others | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | Publisher permission | en_US |
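
The pipeline described in the abstract (posture-based classification of construction trades with deep neural networks) can be illustrated with a minimal sketch of its final step: mapping detected body-pose keypoints to a trade label. The snippet below is not the authors' implementation; the PyTorch dependency, the COCO-style 17-keypoint skeleton, the `TradeClassifier` name, the layer sizes, and the trade labels are all assumptions made for illustration.

```python
# Hedged sketch: classify a worker's trade from 2-D body-pose keypoints
# with a small feed-forward network. All names, dimensions, and labels
# are illustrative assumptions, not the paper's actual implementation.
import torch
import torch.nn as nn

TRADES = ["bar_bender_and_fixer", "other"]  # study scope: one target trade
N_KEYPOINTS = 17                            # e.g. a COCO-style pose skeleton


class TradeClassifier(nn.Module):
    """MLP over flattened (x, y) keypoints -> trade logits."""

    def __init__(self, n_keypoints: int = N_KEYPOINTS,
                 n_trades: int = len(TRADES)) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_keypoints * 2, 128),
            nn.ReLU(),
            nn.Linear(128, n_trades),
        )

    def forward(self, keypoints: torch.Tensor) -> torch.Tensor:
        # keypoints: (batch, n_keypoints, 2) normalized image coordinates
        return self.net(keypoints.flatten(start_dim=1))


model = TradeClassifier()                    # untrained here, so predictions
poses = torch.rand(4, N_KEYPOINTS, 2)        # are random; stand-in postures
pred = model(poses).argmax(dim=1)            # predicted trade index per worker
print([TRADES[int(i)] for i in pred])
```

In the paper's setting, the keypoints would come from a posture-recognition model run over site video, and the classifier would be calibrated via transfer learning against the knowledge base of known-trade postures before trade-level work efficiency is computed.
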
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
n13 Chen_Classification_Construction_Trade.pdf | | 723.65 kB | Adobe PDF | View/Open |
Page views: 168 (last week: 0; last month: 0), as of Apr 13, 2025
Downloads: 58, as of Apr 13, 2025

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.