Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/97776
DC Field | Value | Language
dc.contributor | Department of Rehabilitation Sciences | en_US
dc.contributor | Department of Computing | en_US
dc.creator | Tam, WC | en_US
dc.creator | Fu, EY | en_US
dc.creator | Li, J | en_US
dc.creator | Peacock, R | en_US
dc.creator | Reneke, P | en_US
dc.creator | Ngai, G | en_US
dc.creator | Leong, HV | en_US
dc.creator | Cleary, T | en_US
dc.creator | Huang, HX | en_US
dc.date.accessioned | 2023-03-20T09:00:28Z | -
dc.date.available | 2023-03-20T09:00:28Z | -
dc.identifier.issn | 0957-4174 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/97776 | -
dc.language.iso | en | en_US
dc.publisher | Pergamon Press | en_US
dc.rights | This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/. | en_US
dc.rights | The following publication Tam, W. C., Fu, E. Y., Li, J., Peacock, R., Reneke, P., Ngai, G., ... & Huang, M. X. (2023). Real-time flashover prediction model for multi-compartment building structures using attention based recurrent neural networks. Expert Systems with Applications, 223, 119899 is available at https://doi.org/10.1016/j.eswa.2023.119899. | en_US
dc.subject | Flashover occurrence | en_US
dc.subject | Machine learning | en_US
dc.subject | Real-time prediction | en_US
dc.subject | Realistic fire and opening conditions | en_US
dc.subject | Benchmark models | en_US
dc.title | Real-time flashover prediction model for multi-compartment building structures using attention based recurrent neural networks | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 223 | en_US
dc.identifier.doi | 10.1016/j.eswa.2023.119899 | en_US
dcterms.abstract | This paper presents the development of an attention-based bi-directional gated recurrent unit model, P-Flashv2, for predicting the potential occurrence of flashover in a traditional 111 m² single-story ranch-style family home. Synthetic temperature data for more than 110 000 fire cases with a wide range of fire and vent opening conditions are collected. A temperature limit for heat detectors is applied to mimic the loss of temperature data in real fire scenarios. P-Flashv2 can make predictions with a maximum lead time of 60 s, and its performance is benchmarked against eight different model architectures. Results show that P-Flashv2 has an overall accuracy of ∼87.7% and ∼89.5% for flashover predictions with lead time settings of 30 s and 60 s, respectively. Additional model testing is conducted to assess P-Flashv2's prediction capability in real fire scenarios. Evaluated again with full-scale experimental data, P-Flashv2 has an overall prediction accuracy of ∼82.7% and ∼85.6% for cases with lead time settings of 30 s and 60 s, respectively. Results from this study show that the proposed machine-learning-based model, P-Flashv2, can be used to facilitate data-driven firefighting and reduce firefighter deaths and injuries. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Expert systems with applications, 1 Aug. 2023, v. 223, 119899 | en_US
dcterms.isPartOf | Expert systems with applications | en_US
dcterms.issued | 2023-08-01 | -
dc.identifier.eissn | 1873-6793 | en_US
dc.identifier.artn | 119899 | en_US
dc.description.validate | 202303 bcww | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a1960 | -
dc.identifier.SubFormID | 46208 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Tam_Real-time_flashover_prediction.pdf | Pre-Published version | 2.44 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
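The abstract describes an attention-based bi-directional GRU: the recurrent layer produces one hidden state per time step of the temperature sequence, and an attention layer weights those states before pooling them for the flashover classifier. The following is a minimal, hypothetical sketch of that attention-pooling step only, not the authors' P-Flashv2 implementation; the function names and the scoring scheme (a simple learned-vector dot product) are illustrative assumptions.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, score_vec):
    """Attention pooling over recurrent hidden states (illustrative sketch).

    hidden_states: list of T hidden vectors (each of length H), e.g. the
                   per-time-step outputs of a bi-directional GRU
    score_vec:     length-H vector standing in for learned attention weights
    Returns the attention-weighted sum of the hidden states and the weights.
    """
    # Score each time step by its dot product with the attention vector,
    # then normalize the scores into weights that sum to 1.
    scores = [sum(h_i * w_i for h_i, w_i in zip(h, score_vec))
              for h in hidden_states]
    weights = softmax(scores)
    # Pool: weighted sum of hidden states, dimension by dimension.
    dim = len(hidden_states[0])
    pooled = [sum(w * h[d] for w, h in zip(weights, hidden_states))
              for d in range(dim)]
    return pooled, weights
```

In a model of this kind, `pooled` would feed a final classification layer that outputs the flashover probability at the chosen lead time (30 s or 60 s in the abstract); higher-scoring time steps contribute more to the pooled representation.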