Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/112641
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | -
dc.creator | Mei, SH | -
dc.creator | Lian, JW | -
dc.creator | Wang, XF | -
dc.creator | Su, YR | -
dc.creator | Ma, MY | -
dc.creator | Chau, LP | -
dc.date.accessioned | 2025-04-24T00:28:16Z | -
dc.date.available | 2025-04-24T00:28:16Z | -
dc.identifier.uri | http://hdl.handle.net/10397/112641 | -
dc.language.iso | en | en_US
dc.publisher | American Association for the Advancement of Science (AAAS) | en_US
dc.rights | Copyright © 2024 Shaohui Mei et al. Exclusive licensee Aerospace Information Research Institute, Chinese Academy of Sciences. Distributed under a Creative Commons Attribution License 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Shaohui Mei, Jiawei Lian, Xiaofei Wang, Yuru Su, Mingyang Ma, Lap-Pui Chau. A Comprehensive Study on the Robustness of Deep Learning-Based Image Classification and Object Detection in Remote Sensing: Surveying and Benchmarking. J Remote Sens. 2024;4:0219 is available at https://doi.org/10.34133/remotesensing.0219. | en_US
dc.title | A comprehensive study on the robustness of deep learning-based image classification and object detection in remote sensing : surveying and benchmarking | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 4 | -
dc.identifier.doi | 10.34133/remotesensing.0219 | -
dcterms.abstract | Deep neural networks (DNNs) have found widespread applications in interpreting remote sensing (RS) imagery. However, it has been demonstrated in previous works that DNNs are susceptible and vulnerable to different types of noises, particularly adversarial noises. Surprisingly, there has been a lack of comprehensive studies on the robustness of RS tasks, prompting us to undertake a thorough survey and benchmark on the robustness of DNNs in RS. This manuscript conducts a comprehensive study of both the natural robustness and adversarial robustness of DNNs in RS tasks. Specifically, we systematically and extensively survey the robustness of DNNs from various perspectives such as noise type, attack domain, and attacker's knowledge, encompassing typical applications such as object detection and image classification. Building upon this foundation, we further develop a rigorous benchmark for testing the robustness of DNN-based models, which entails the construction of noised datasets, robustness testing, and evaluation. Under the proposed benchmark, we perform a meticulous and systematic examination of the robustness of typical deep learning algorithms in the context of object detection and image classification applications. Through comprehensive survey and benchmark, we uncover insightful and intriguing findings, which shed light on the relationship between adversarial noise crafting and model training, yielding a deeper understanding of the susceptibility and limitations of various DNN-based models, and providing guidance for the development of more resilient and robust models. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Journal of remote sensing, 2024, v. 4, 0219 | -
dcterms.isPartOf | Journal of remote sensing | -
dcterms.issued | 2024 | -
dc.identifier.isi | WOS:001329522000001 | -
dc.identifier.eissn | 2694-1589 | -
dc.identifier.artn | 0219 | -
dc.description.validate | 202504 bcrc | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
remotesensing.0219.pdf | | 41.32 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Web of Science citations: 22 (as of Dec 18, 2025)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.