Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/88627
DC Field | Value | Language
dc.contributor | Department of Land Surveying and Geo-Informatics | -
dc.creator | Fang, B | -
dc.creator | Kou, R | -
dc.creator | Pan, L | -
dc.creator | Chen, PF | -
dc.date.accessioned | 2020-12-22T01:06:24Z | -
dc.date.available | 2020-12-22T01:06:24Z | -
dc.identifier.uri | http://hdl.handle.net/10397/88627 | -
dc.language.iso | en | en_US
dc.publisher | Molecular Diversity Preservation International (MDPI) | en_US
dc.rights | © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Fang, B.; Kou, R.; Pan, L.; Chen, P. Category-Sensitive Domain Adaptation for Land Cover Mapping in Aerial Scenes. Remote Sens. 2019, 11, 2631 is available at https://dx.doi.org/10.3390/rs11222631 | en_US
dc.subject | Domain adaptation | en_US
dc.subject | Land cover mapping | en_US
dc.subject | Aerial images | en_US
dc.subject | Adversarial learning | en_US
dc.subject | Geometry-consistency | en_US
dc.subject | Co-training | en_US
dc.title | Category-sensitive domain adaptation for land cover mapping in aerial scenes | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 1 | -
dc.identifier.epage | 24 | -
dc.identifier.volume | 11 | -
dc.identifier.issue | 22 | -
dc.identifier.doi | 10.3390/rs11222631 | -
dcterms.abstract | Since manually labeling aerial images for pixel-level classification is expensive and time-consuming, developing strategies for land cover mapping without reference labels is essential and meaningful. As an efficient solution to this issue, domain adaptation has been widely utilized in numerous semantic-labeling applications. However, current approaches generally pursue marginal distribution alignment between the source and target features and ignore category-level alignment; directly applying them to land cover mapping therefore yields unsatisfactory performance in the target domain. In our research, to address this problem, we embed a geometry-consistent generative adversarial network (GcGAN) into a co-training adversarial learning network (CtALN), and then develop a category-sensitive domain adaptation (CsDA) method for land cover mapping using very-high-resolution (VHR) optical aerial images. The GcGAN aims to eliminate the domain discrepancies between labeled and unlabeled images while retaining their intrinsic land cover information by translating the features of the labeled images from the source domain to the target domain. Meanwhile, the CtALN aims to learn a semantic labeling model in the target domain with the translated features and corresponding reference labels. By training this hybrid framework, our method learns to distill knowledge from the source domain and transfer it to the target domain, while preserving not only global domain consistency but also category-level consistency between labeled and unlabeled images in the feature space. The experimental results on two airborne benchmark datasets and the comparison with other state-of-the-art methods verify the robustness and superiority of our proposed CsDA. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Remote sensing, 2019, v. 11, no. 22, 2631, p. 1-24 | -
dcterms.isPartOf | Remote sensing | -
dcterms.issued | 2019-11-02 | -
dc.identifier.isi | WOS:000502284300036 | -
dc.identifier.eissn | 2072-4292 | -
dc.identifier.artn | 2631 | -
dc.description.validate | 202012 bcrc | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | CC | en_US
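The adversarial feature-alignment idea summarized in the abstract (a translator maps source features toward the target domain while a discriminator tries to tell the domains apart) can be illustrated with a minimal one-dimensional NumPy sketch. This is a hypothetical toy, not the authors' GcGAN/CtALN implementation: the affine translator, the logistic discriminator, and the Gaussian "domains" are all assumptions made for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, 500)   # toy "source-domain" features
tgt = rng.normal(2.0, 1.0, 500)   # toy "target-domain" features, shifted

a, b = 1.0, 0.0    # affine "translator" applied to target features
w, c = 0.0, 0.0    # logistic discriminator: p(source) = sigmoid(w*x + c)
lr = 0.05

initial_gap = abs(np.mean(tgt) - np.mean(src))

for step in range(2000):
    adapted = a * tgt + b
    # Discriminator step: binary cross-entropy, source=1, adapted target=0.
    for x, y in ((src, 1.0), (adapted, 0.0)):
        p = sigmoid(w * x + c)
        grad = p - y                  # d(BCE)/d(logit)
        w -= lr * np.mean(grad * x)
        c -= lr * np.mean(grad)
    # Translator step: update (a, b) so adapted features fool the
    # discriminator, i.e. are scored as label 1 ("source-like").
    p = sigmoid(w * adapted + c)
    grad = p - 1.0
    a -= lr * np.mean(grad * w * tgt)  # chain rule: d(logit)/da = w*t
    b -= lr * np.mean(grad * w)        # d(logit)/db = w

gap = abs(np.mean(a * tgt + b) - np.mean(src))
print(gap)  # should be much smaller than the initial shift of ~2
```

In the paper's setting the translator is a full GAN generator over image features and the alignment is additionally made category-sensitive via co-training; this sketch only shows the basic min-max dynamic of marginal alignment that the abstract says is, on its own, insufficient.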
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Fang_Category-Sensitive_Domain_Adaptation.pdf | | 8.46 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 84 (as of May 11, 2025; last week: 0)
Downloads: 26 (as of May 11, 2025)

SCOPUS™ citations: 30 (as of Jun 21, 2024)
Web of Science™ citations: 29 (as of May 15, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.