Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/81341
Title: Deep feature fusion with integration of residual connection and attention model for classification of VHR remote sensing images
Authors: Wang, JC 
Shen, L
Qiao, WF
Dai, YS
Li, ZL 
Keywords: Fully convolutional network (FCN)
Very-high-resolution (VHR) image classification
Residual connection
Attention model
Feature fusion
Issue Date: 2019
Publisher: Molecular Diversity Preservation International (MDPI)
Source: Remote Sensing, 1 July 2019, v. 11, no. 13, 1617, p. 1-22
Journal: Remote Sensing
Abstract: The classification of very-high-resolution (VHR) remote sensing images is essential in many applications. However, high intraclass and low interclass variations in such images pose serious challenges. Fully convolutional network (FCN) models, which benefit from a powerful feature-learning ability, have shown impressive performance and great potential. Nevertheless, the original FCN method yields only coarse-resolution classification results. Deep feature fusion is often employed to improve the resolution of the outputs, but existing fusion strategies cannot properly exploit low-level features or account for the varying importance of features at different scales. To overcome these problems, this paper proposes a novel, end-to-end, fully convolutional network that integrates a multiconnection ResNet model and a class-specific attention model into a unified framework. The former fuses multilevel deep features without introducing redundant information from low-level features; the latter learns the contribution of each geo-object's features at each scale. Extensive experiments on two open datasets show that the proposed method achieves class-specific, scale-adaptive classification results and outperforms other state-of-the-art methods. The results were submitted to the International Society for Photogrammetry and Remote Sensing (ISPRS) online contest for comparison with more than 50 other methods; the proposed method (ID: SWJ_2) ranks first in overall accuracy, even though the additional digital surface model (DSM) data offered by ISPRS were not used and no postprocessing was applied.
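
Note: The fusion idea summarized in the abstract (class-specific attention weighting multilevel score maps) can be illustrated with a minimal sketch. This is not the authors' released implementation: it assumes PyTorch, and the module and parameter names (MultiScaleAttentionFusion, in_channels, num_classes) are hypothetical.

# Illustrative sketch only -- not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleAttentionFusion(nn.Module):
    """Fuse class score maps from several scales using learned,
    class-specific attention weights (softmax across scales)."""

    def __init__(self, in_channels, num_classes):
        super().__init__()
        # 1x1 convolutions map each scale's features to per-class scores.
        self.score_convs = nn.ModuleList(
            [nn.Conv2d(c, num_classes, kernel_size=1) for c in in_channels])
        # A parallel head predicts one attention map per class per scale.
        self.attn_convs = nn.ModuleList(
            [nn.Conv2d(c, num_classes, kernel_size=1) for c in in_channels])

    def forward(self, features):
        # features: list of [B, C_i, H_i, W_i] tensors, coarsest to finest.
        target_size = features[-1].shape[-2:]
        scores, attns = [], []
        for feat, s_conv, a_conv in zip(features, self.score_convs, self.attn_convs):
            scores.append(F.interpolate(s_conv(feat), size=target_size,
                                        mode="bilinear", align_corners=False))
            attns.append(F.interpolate(a_conv(feat), size=target_size,
                                       mode="bilinear", align_corners=False))
        # Softmax over the scale dimension yields per-pixel, per-class
        # fusion weights, so each class can favour a different scale.
        attn = torch.softmax(torch.stack(attns, dim=0), dim=0)   # [S, B, K, H, W]
        fused = (attn * torch.stack(scores, dim=0)).sum(dim=0)   # [B, K, H, W]
        return fused

# Toy usage: three scales (1024, 512, 256 channels), six land-cover classes.
fusion = MultiScaleAttentionFusion(in_channels=[1024, 512, 256], num_classes=6)
feats = [torch.randn(1, 1024, 16, 16),   # coarsest
         torch.randn(1, 512, 32, 32),
         torch.randn(1, 256, 64, 64)]    # finest
out = fusion(feats)                      # -> [1, 6, 64, 64] class score map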
URI: http://hdl.handle.net/10397/81341
EISSN: 2072-4292
DOI: 10.3390/rs11131617
Rights: © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
The following publication Wang, J.; Shen, L.; Qiao, W.; Dai, Y.; Li, Z. Deep Feature Fusion with Integration of Residual Connection and Attention Model for Classification of VHR Remote Sensing Images. Remote Sens. 2019, 11, 1617, 1-22 is available at https://dx.doi.org/10.3390/rs11131617
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: Wang_Deep_Feature_Fusion.pdf (2.7 MB, Adobe PDF)

Page view(s): 13 (as of Oct 22, 2019)
Download(s): 5 (as of Oct 22, 2019)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.