Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/117800
DC FieldValueLanguage
dc.contributorMainland Development Office-
dc.contributorOtto Poon Charitable Foundation Smart Cities Research Institute-
dc.contributorDepartment of Land Surveying and Geo-Informatics-
dc.creatorGao, X-
dc.creatorShi, W-
dc.creatorZhang, M-
dc.creatorWang, L-
dc.date.accessioned2026-03-05T07:56:32Z-
dc.date.available2026-03-05T07:56:32Z-
dc.identifier.issn1939-1404-
dc.identifier.urihttp://hdl.handle.net/10397/117800-
dc.language.isoenen_US
dc.publisherInstitute of Electrical and Electronics Engineersen_US
dc.rights© 2025 The Authors. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/en_US
dc.rightsThe following publication X. Gao, W. Shi, M. Zhang and L. Wang, "DAFDM: A Discerning Deep Learning Model for Active Fire Detection Based on Landsat-8 Imagery," in IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 18, pp. 7982-8000, 2025 is available at https://doi.org/10.1109/JSTARS.2025.3545114.en_US
dc.subjectActive fire (AF) detectionen_US
dc.subjectConvolutional neural network (CNN)en_US
dc.subjectDeep learning (DL)en_US
dc.subjectLand surface temperature (LST)en_US
dc.subjectLandsat-8en_US
dc.subjectRemote sensingen_US
dc.titleDAFDM: a discerning deep learning model for active fire detection based on Landsat-8 imageryen_US
dc.typeJournal/Magazine Articleen_US
dc.identifier.spage7982-
dc.identifier.epage8000-
dc.identifier.volume18-
dc.identifier.doi10.1109/JSTARS.2025.3545114-
dcterms.abstractMonitoring active fire (AF) utilizing remote sensing imagery provides critical support for fire rescue and environmental protection. Traditional methods for detecting AFs rely on the statistical analysis of AF radiance and background features. However, these algorithms are resource-intensive to develop and exhibit limited adaptability, particularly in distinguishing AF from interference pixels. Deep learning (DL) technologies, which can extract deep features from images, offer a new solution for efficiently detecting AF. This article proposes an AF detection model based on convolutional neural networks, named DAFDM. By integrating multilayer features through an enhanced feature processing module, the model produces high-quality AF information, accurately detecting AF from the background. Because uncorrected false alarms in existing training labels make it challenging for DL models to distinguish interference pixels, we construct a Landsat-8 dataset encompassing various fire types and interference objects, with precise labels. Comparing several architectures, we find that only U-Net type models can discern the AF boundary pixels fully and accurately. The proposed method outperforms other AF detection algorithms, achieving an IoU and F1-score of 87.28% and 93.21%, respectively. Experimental results demonstrate that DAFDM possesses robust generalization capability in distinguishing interference pixels. The incorporation of land surface temperature as auxiliary data further improves DAFDM's performance, with interpretability methods employed to elucidate the impact of input data on predictions. This method is anticipated to further contribute to AF monitoring and wildfire development pattern analysis.-
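The abstract reports segmentation quality as IoU and F1-score over detected active-fire pixels. As a minimal sketch (not the authors' evaluation code; the masks and helper name below are illustrative), these two metrics can be computed from a predicted binary fire mask and a ground-truth mask as follows:

```python
# Hedged sketch: pixel-wise IoU and F1-score for binary active-fire masks,
# the two metrics the abstract reports (87.28% IoU, 93.21% F1).
# Arrays here are toy data, not the paper's dataset or code.
import numpy as np

def iou_and_f1(pred: np.ndarray, truth: np.ndarray) -> tuple[float, float]:
    """Return (IoU, F1) for boolean masks of equal shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # fire pixels correctly detected
    fp = np.logical_and(pred, ~truth).sum()   # false alarms
    fn = np.logical_and(~pred, truth).sum()   # missed fire pixels
    denom_iou = tp + fp + fn
    denom_f1 = 2 * tp + fp + fn
    iou = tp / denom_iou if denom_iou else 1.0
    f1 = 2 * tp / denom_f1 if denom_f1 else 1.0
    return float(iou), float(f1)

# Toy 3x3 scene: one false alarm and one missed fire pixel.
pred = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 1]])
truth = np.array([[1, 1, 0], [0, 1, 1], [0, 0, 0]])
iou, f1 = iou_and_f1(pred, truth)  # iou = 0.6, f1 = 0.75
```

Note that for binary masks the two metrics are monotonically related (F1 = 2·IoU / (1 + IoU)), which is consistent with the paired figures the abstract reports.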
dcterms.accessRightsopen accessen_US
dcterms.bibliographicCitationIEEE journal of selected topics in applied earth observations and remote sensing, 2025, v. 18, p. 7982-8000-
dcterms.isPartOfIEEE journal of selected topics in applied earth observations and remote sensing-
dcterms.issued2025-
dc.identifier.scopus2-s2.0-85219503015-
dc.identifier.eissn2151-1535-
dc.description.validate202603 bcch-
dc.description.oaVersion of Recorden_US
dc.identifier.FolderNumberOA_Scopus/WOSen_US
dc.description.fundingSourceOthersen_US
dc.description.fundingTextThis work was supported in part by Shenzhen Park of Hetao Shenzhen-Hong Kong Science and Technology Innovation Cooperation Zone (Theories for Spatiotemporal Intelligence and Reliable Data Analysis, Project ID: HZOSWS-KCCYB-2024058), in part by Otto Poon Charitable Foundation Smart Cities Research Institute, the Hong Kong Polytechnic University (Work Program: CD06), and in part by The Hong Kong Polytechnic University (U-ZECR).en_US
dc.description.pubStatusPublisheden_US
dc.description.oaCategoryCCen_US
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Gao_DAFDM_Discerning_Deep.pdf | | 5.39 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.