Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/108018
DC Field | Value | Language
dc.contributor | Department of Building Environment and Energy Engineering | en_US
dc.creator | Wang, Z | en_US
dc.creator | Zhang, T | en_US
dc.creator | Huang, X | en_US
dc.date.accessioned | 2024-07-23T01:36:21Z | -
dc.date.available | 2024-07-23T01:36:21Z | -
dc.identifier.issn | 0924-669X | en_US
dc.identifier.uri | http://hdl.handle.net/10397/108018 | -
dc.language.iso | en | en_US
dc.publisher | Springer | en_US
dc.rights | © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023 | en_US
dc.rights | This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms), but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s10489-023-05231-x. | en_US
dc.subject | Fire Engineering | en_US
dc.subject | Flame images | en_US
dc.subject | Image analysis | en_US
dc.subject | Semantic segmentation | en_US
dc.subject | Smart firefighting | en_US
dc.title | Explainable deep learning for image-driven fire calorimetry | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 1047 | en_US
dc.identifier.epage | 1062 | en_US
dc.identifier.volume | 54 | en_US
dc.identifier.issue | 1 | en_US
dc.identifier.doi | 10.1007/s10489-023-05231-x | en_US
dcterms.abstract | The rapid advancement of deep learning and computer vision has driven the intelligent evolution of fire detection, quantification, and fighting, although most AI models remain opaque black boxes. This work applies explainable deep learning methods to quantify fire power from flame images and aims to elucidate the underlying mechanism of computer-vision fire calorimetry. The process begins with a pre-trained fire segmentation model, which is used to create a flame image database in four formats: (1) original RGB, (2) background-free RGB, (3) background-free grey, and (4) background-free binary. This diverse database accounts for factors such as background, color, and brightness. The synthetic database is then employed to train and test the fire-calorimetry AI model. Results highlight the dominant role of flame area in fire calorimetry, while other factors display minimal influence. Enhancing the accuracy of flame segmentation significantly reduces the error of computer-vision fire calorimetry to less than 20%. Finally, the study incorporates the Gradient-weighted Class Activation Mapping (Grad-CAM) method to visualize the pixel-level contribution to fire image identification. This research deepens the understanding of vision-based fire calorimetry and provides scientific support for future AI applications in fire monitoring, digital twins, and smart firefighting. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Applied intelligence, Jan. 2024, v. 54, no. 1, p. 1047-1062 | en_US
dcterms.isPartOf | Applied intelligence | en_US
dcterms.issued | 2024-01 | -
dc.identifier.scopus | 2-s2.0-85180700753 | -
dc.description.validate | 202407 bcwh | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a3084b | -
dc.identifier.SubFormID | 49438 | -
dc.description.fundingSource | RGC | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
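The Grad-CAM method named in the abstract above weights the channels of a convolutional feature map by the spatially pooled gradients of the target score, yielding a heatmap of pixel-level contributions. A minimal sketch in PyTorch follows, for illustration only: the record does not specify the authors' model or framework, so the resnet18 backbone, the hooked layer4 block, and the random input tensor are all stand-in assumptions.

```python
# Minimal Grad-CAM sketch. Hedged: resnet18, layer4, and the random input
# are placeholders; the paper's actual model/framework is not in this record.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

model = resnet18(weights=None)  # stand-in backbone, not the authors' model
model.eval()

activations, gradients = {}, {}

def save_activation(module, inputs, output):
    activations["feat"] = output.detach()

def save_gradient(module, grad_input, grad_output):
    gradients["feat"] = grad_output[0].detach()

# Hook the last convolutional stage to capture features and their gradients.
model.layer4.register_forward_hook(save_activation)
model.layer4.register_full_backward_hook(save_gradient)

img = torch.randn(1, 3, 224, 224)   # stand-in for a flame image tensor
scores = model(img)
model.zero_grad()
scores[0, scores[0].argmax()].backward()  # backprop the top class score

# Channel weights = global-average-pooled gradients; CAM = ReLU of the
# weighted sum over channels, upsampled to the input resolution.
weights = gradients["feat"].mean(dim=(2, 3), keepdim=True)   # (1, C, 1, 1)
cam = F.relu((weights * activations["feat"]).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=img.shape[2:], mode="bilinear",
                    align_corners=False)[0, 0]
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)     # scale to [0, 1]
```

Overlaying the normalized cam on the flame image is what produces pixel-contribution maps of the kind described in the abstract; pooling the gradients to one weight per channel is the design choice that keeps the map class-discriminative while remaining cheap to compute.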
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Wang_Explainable_Deep_Learning.pdf | Pre-Published version | 3.34 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 77 (as of Apr 14, 2025)
Downloads: 1 (as of Apr 14, 2025)
SCOPUS™ Citations: 13 (as of Dec 19, 2025)
Web of Science™ Citations: 5 (as of Jan 9, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.