Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/92287
DC Field | Value | Language
dc.contributor | Department of Building and Real Estate | en_US
dc.creator | Zhang, Y | en_US
dc.creator | Xiao, B | en_US
dc.creator | Al-Hussein, M | en_US
dc.creator | Li, X | en_US
dc.date.accessioned | 2022-03-11T02:36:56Z | -
dc.date.available | 2022-03-11T02:36:56Z | -
dc.identifier.issn | 0926-5805 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/92287 | -
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.rights | © 2022 Elsevier B.V. All rights reserved. | en_US
dc.rights | © 2022. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/. | en_US
dc.rights | The following publication Zhang, Y., Xiao, B., Al-Hussein, M., & Li, X. (2022). Prediction of human restorative experience for human-centered residential architecture design: A non-immersive VR–DOE-based machine learning method. Automation in Construction, 136, 104189 is available at https://dx.doi.org/10.1016/j.autcon.2022.104189. | en_US
dc.subject | Built environment | en_US
dc.subject | Design of experiment | en_US
dc.subject | Human-centered design | en_US
dc.subject | Machine learning | en_US
dc.subject | Prediction model | en_US
dc.subject | Residential design | en_US
dc.subject | Restorative experience | en_US
dc.subject | Virtual reality | en_US
dc.title | Prediction of human restorative experience for human-centered residential architecture design : a non-immersive VR–DOE-based machine learning method | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 136 | en_US
dc.identifier.doi | 10.1016/j.autcon.2022.104189 | en_US
dcterms.abstract | The topic of restorative experience in built environments has attracted growing attention because of rising stress levels in modern society. Researchers have sought to identify the architectural features that influence a person’s perceived restorative experience in order to achieve human-centered architectural designs. However, the relevant design knowledge is scattered and unsystematic, making it difficult for designers to interpret the information and make informed decisions in practice. This paper explores the feasibility of machine learning for capturing the restorative quality of design alternatives, thereby providing decision support for proactive architectural design analysis. To address feature selection and the uncertainty associated with affective modeling, a framework is introduced that integrates design of experiments (DOE) and machine learning methods. Human restorative experience is assessed within non-immersive VR environments using self-reported psychometric scales. The general regression neural network (GRNN) is found to outperform the other machine learning methods in forecasting the restorative experience. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Automation in construction, Apr. 2022, v. 136, 104189 | en_US
dcterms.isPartOf | Automation in construction | en_US
dcterms.issued | 2022-04 | -
dc.identifier.scopus | 2-s2.0-85125665944 | -
dc.identifier.eissn | 1872-7891 | en_US
dc.identifier.artn | 104189 | en_US
dc.description.validate | 202203 bcvc | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a1205-01 | -
dc.identifier.SubFormID | 44165 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Alberta Innovates; National Natural Science Foundation of China | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
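
The abstract above describes a pipeline in which design alternatives generated through design of experiments (DOE) are rated in non-immersive VR with self-reported psychometric scales, and a general regression neural network (GRNN) then predicts the restorative-experience score of new designs. As a rough illustration only, and not the authors' code or data, the sketch below implements a GRNN in its usual form (Gaussian-kernel Nadaraya–Watson regression) on hypothetical design factors; the factor names, value ranges, simulated scores, and smoothing parameter are all assumptions.

```python
# Illustrative sketch only: a general regression neural network (GRNN),
# i.e. Gaussian-kernel Nadaraya-Watson regression, predicting a self-reported
# restorativeness score from architectural design factors.
# All factors, data, and the smoothing parameter are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical DOE-style design factors (one row per design alternative):
# window-to-wall ratio, ceiling height (m), visible greenery ratio.
X = rng.uniform([0.1, 2.4, 0.0], [0.9, 3.6, 0.6], size=(80, 3))

# Hypothetical restorativeness score (e.g. a 1-7 psychometric scale mean),
# simulated as a smooth function of the factors plus noise.
y = 3.0 + 2.0 * X[:, 0] + 0.5 * (X[:, 1] - 2.4) + 3.0 * X[:, 2]
y = np.clip(y + rng.normal(0.0, 0.3, size=len(y)), 1.0, 7.0)


def grnn_predict(X_train, y_train, X_query, sigma=0.3):
    """GRNN prediction: kernel-weighted average of the training targets."""
    # Standardize features so no single factor dominates the distance.
    mean, std = X_train.mean(axis=0), X_train.std(axis=0)
    Xt = (X_train - mean) / std
    Xq = (X_query - mean) / std
    # Squared Euclidean distances between each query and each training point.
    d2 = ((Xq[:, None, :] - Xt[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma**2))    # pattern layer: Gaussian weights
    return (w @ y_train) / w.sum(axis=1)  # summation layer: weighted mean


# Simple hold-out evaluation of the sketch.
train, test = np.arange(60), np.arange(60, 80)
pred = grnn_predict(X[train], y[train], X[test])
print("MAE on held-out designs:", np.mean(np.abs(pred - y[test])))
```

The smoothing parameter sigma is the GRNN's only hyperparameter, which is one reason the method suits the small datasets typical of DOE studies; in practice it would be tuned by cross-validation rather than fixed as above.
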
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Zhang_Prediction_Human_Restorative.pdf | Pre-Published version | 1.74 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 171 (3 in the last week; as of Nov 9, 2025)
Downloads: 173 (as of Nov 9, 2025)
Scopus citations: 19 (as of Dec 19, 2025)
Web of Science citations: 14 (as of Dec 18, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.