Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/88627
Title: Category-sensitive domain adaptation for land cover mapping in aerial scenes
Authors: Fang, B
Kou, R
Pan, L
Chen, PF 
Issue Date: 2-Nov-2019
Source: Remote sensing, 2 Nov. 2019, v. 11, no. 22, 2631, p. 1-24
Abstract: Since manually labeling aerial images for pixel-level classification is expensive and time-consuming, developing strategies for land cover mapping without reference labels is essential and meaningful. As an efficient solution to this problem, domain adaptation has been widely utilized in numerous semantic labeling-based applications. However, current approaches generally pursue marginal distribution alignment between the source and target features and ignore the category-level alignment. Therefore, directly applying them to land cover mapping leads to unsatisfactory performance in the target domain. In our research, to address this problem, we embed a geometry-consistent generative adversarial network (GcGAN) into a co-training adversarial learning network (CtALN), and then develop a category-sensitive domain adaptation (CsDA) method for land cover mapping using very-high-resolution (VHR) optical aerial images. The GcGAN aims to eliminate the domain discrepancies between labeled and unlabeled images while retaining their intrinsic land cover information by translating the features of the labeled images from the source domain to the target domain. Meanwhile, the CtALN aims to learn a semantic labeling model in the target domain with the translated features and corresponding reference labels. By training this hybrid framework, our method learns to distill knowledge from the source domain and transfer it to the target domain, preserving not only global domain consistency but also category-level consistency between labeled and unlabeled images in the feature space. Experimental results on two airborne benchmark datasets and comparisons with other state-of-the-art methods verify the robustness and superiority of our proposed CsDA.
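The abstract describes a hybrid adversarial framework in which an image translator renders labeled source images in the target style under a geometry-consistency constraint, while a semantic labeling network is trained on the translated images and aligned with the unlabeled target domain through an adversarial loss on its output label maps. As a rough illustration of how such a pipeline fits together, below is a minimal PyTorch sketch. The module names (Translator, Segmenter, OutputDiscriminator), the toy architectures, the 90-degree rotation used as the geometric transform, and the loss weights are all assumptions made for illustration; they do not come from the paper, and the category-level alignment is only approximated here by discriminating on softmax label maps.

```python
# Illustrative sketch only: toy modules and assumed loss weights, standing in for
# the GcGAN translator, the semantic labeling network, and the adversarial
# alignment described in the abstract. Not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 6  # assumed number of land cover categories

class Translator(nn.Module):
    """Toy source-to-target image translator (plays the role of the GcGAN generator)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh())

    def forward(self, x):
        return self.net(x)

class Segmenter(nn.Module):
    """Toy per-pixel classifier trained on translated source images."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, NUM_CLASSES, 1))

    def forward(self, x):
        return self.net(x)  # logits of shape (N, NUM_CLASSES, H, W)

class OutputDiscriminator(nn.Module):
    """Judges whether a softmax label map comes from translated-source or target images."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(NUM_CLASSES, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 1, 3, stride=2, padding=1))

    def forward(self, p):
        return self.net(p)

def geometry_consistency_loss(g, x, t):
    """GcGAN idea: translating a transformed image should match transforming the translation."""
    return F.l1_loss(g(t(x)), t(g(x)))

rot90 = lambda x: torch.rot90(x, 1, dims=(2, 3))  # assumed geometric transform

translator, segmenter, discriminator = Translator(), Segmenter(), OutputDiscriminator()
opt_g = torch.optim.Adam(list(translator.parameters()) + list(segmenter.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# Dummy batch: labeled source images and unlabeled target images.
src_img = torch.rand(2, 3, 64, 64)
src_lbl = torch.randint(0, NUM_CLASSES, (2, 64, 64))
tgt_img = torch.rand(2, 3, 64, 64)

# --- translator + segmenter update ---
opt_g.zero_grad()
fake_tgt = translator(src_img)                              # source content in target style
loss_gc = geometry_consistency_loss(translator, src_img, rot90)
loss_seg = F.cross_entropy(segmenter(fake_tgt), src_lbl)    # supervised by source labels
d_out = discriminator(F.softmax(segmenter(tgt_img), dim=1))
loss_adv = bce(d_out, torch.ones_like(d_out))               # push target label maps toward source ones
(loss_seg + 10.0 * loss_gc + 0.01 * loss_adv).backward()
opt_g.step()

# --- discriminator update ---
opt_d.zero_grad()
p_src = F.softmax(segmenter(fake_tgt.detach()), dim=1).detach()
p_tgt = F.softmax(segmenter(tgt_img), dim=1).detach()
d_src, d_tgt = discriminator(p_src), discriminator(p_tgt)
loss_d = bce(d_src, torch.ones_like(d_src)) + bce(d_tgt, torch.zeros_like(d_tgt))
loss_d.backward()
opt_d.step()
print(f"seg={loss_seg.item():.3f}  gc={loss_gc.item():.3f}  adv={loss_adv.item():.3f}  d={loss_d.item():.3f}")
```

In this simplified setup the discriminator sees class-probability maps rather than raw features, which is one common way to carry category information into the adversarial alignment; the paper's actual co-training and category-sensitive mechanisms are more involved.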
Keywords: Domain adaptation
Land cover mapping
Aerial images
Adversarial learning
Geometry consistency
Co-training
Publisher: MDPI (Multidisciplinary Digital Publishing Institute)
Journal: Remote sensing 
EISSN: 2072-4292
DOI: 10.3390/rs11222631
Rights: © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
The following publication Fang, B.; Kou, R.; Pan, L.; Chen, P. Category-Sensitive Domain Adaptation for Land Cover Mapping in Aerial Scenes. Remote Sens. 2019, 11, 2631 is available at https://dx.doi.org/10.3390/rs11222631
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: Fang_Category-Sensitive_Domain_Adaptation.pdf
Size: 8.46 MB
Format: Adobe PDF
Open Access Information
Status: Open access
File Version: Version of Record

Page views: 84 (as of May 11, 2025)
Downloads: 26 (as of May 11, 2025)

Scopus citations: 30 (as of Jun 21, 2024)
Web of Science citations: 29 (as of May 15, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.