Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/81341
DC Field    Value    Language
dc.contributor    Department of Land Surveying and Geo-Informatics    -
dc.creator    Wang, JC    -
dc.creator    Shen, L    -
dc.creator    Qiao, WF    -
dc.creator    Dai, YS    -
dc.creator    Li, ZL    -
dc.date.accessioned    2019-09-20T00:55:07Z    -
dc.date.available    2019-09-20T00:55:07Z    -
dc.identifier.uri    http://hdl.handle.net/10397/81341    -
dc.language.iso    en    en_US
dc.publisher    Molecular Diversity Preservation International (MDPI)    en_US
dc.rights    © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).    en_US
dc.rights    The following publication Wang, J.; Shen, L.; Qiao, W.; Dai, Y.; Li, Z. Deep Feature Fusion with Integration of Residual Connection and Attention Model for Classification of VHR Remote Sensing Images. Remote Sens. 2019, 11, 1617, 1-22 is available at https://dx.doi.org/10.3390/rs11131617    en_US
dc.subject    Fully convolutional network (FCN)    en_US
dc.subject    Very-high-resolution (VHR) image classification    en_US
dc.subject    Residual connection    en_US
dc.subject    Attention model    en_US
dc.subject    Feature fusion    en_US
dc.title    Deep feature fusion with integration of residual connection and attention model for classification of VHR remote sensing images    en_US
dc.type    Journal/Magazine Article    en_US
dc.identifier.spage    1    -
dc.identifier.epage    22    -
dc.identifier.volume    11    -
dc.identifier.issue    13    -
dc.identifier.doi    10.3390/rs11131617    -
dcterms.abstract    The classification of very-high-resolution (VHR) remote sensing images is essential in many applications. However, high intraclass and low interclass variations in these kinds of images pose serious challenges. Fully convolutional network (FCN) models, which benefit from a powerful feature learning ability, have shown impressive performance and great potential. Nevertheless, only classification results with coarse resolution can be obtained from the original FCN method. Deep feature fusion is often employed to improve the resolution of outputs. Existing strategies for such fusion are not capable of properly utilizing the low-level features and considering the importance of features at different scales. This paper proposes a novel, end-to-end, fully convolutional network to integrate a multiconnection ResNet model and a class-specific attention model into a unified framework to overcome these problems. The former fuses multilevel deep features without introducing any redundant information from low-level features. The latter can learn the contributions from different features of each geo-object at each scale. Extensive experiments on two open datasets indicate that the proposed method can achieve class-specific scale-adaptive classification results and it outperforms other state-of-the-art methods. The results were submitted to the International Society for Photogrammetry and Remote Sensing (ISPRS) online contest for comparison with more than 50 other methods. The results indicate that the proposed method (ID: SWJ_2) ranks #1 in terms of overall accuracy, even though no additional digital surface model (DSM) data that were offered by ISPRS were used and no postprocessing was applied.    -
dcterms.accessRights    open access    en_US
dcterms.bibliographicCitation    Remote sensing, 1 July 2019, v. 11, no. 13, 1617, p. 1-22    -
dcterms.isPartOf    Remote sensing    -
dcterms.issued    2019    -
dc.identifier.isi    WOS:000477049000108    -
dc.identifier.eissn    2072-4292    -
dc.identifier.artn    1617    -
dc.description.validate    201909 bcrc    -
dc.description.oa    Version of Record    en_US
dc.identifier.FolderNumber    OA_Scopus/WOS    en_US
dc.description.pubStatus    Published    en_US
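
The class-specific attention fusion described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name `class_specific_fusion`, the array shapes, and the random attention logits are all illustrative assumptions. The idea shown is that per-class, per-location attention weights are softmax-normalized across scales and used to combine class score maps from different levels of the network.

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def class_specific_fusion(score_maps, attention_logits):
    """Fuse per-scale class score maps with class-specific attention.

    score_maps:       (S, C, H, W) class scores at S scales, assumed
                      already upsampled to a common H x W resolution.
    attention_logits: (S, C, H, W) attention logits per scale, class,
                      and location (learned in the paper; given here).
    Returns fused scores of shape (C, H, W).
    """
    # Normalize across scales so the weights for each class/pixel sum to 1,
    # letting each geo-object class favor the scales that suit it best.
    weights = softmax(attention_logits, axis=0)   # (S, C, H, W)
    return (weights * score_maps).sum(axis=0)     # (C, H, W)

# Toy example: 3 scales, 4 classes, 8 x 8 score maps.
rng = np.random.default_rng(0)
scores = rng.normal(size=(3, 4, 8, 8))
logits = rng.normal(size=(3, 4, 8, 8))
fused = class_specific_fusion(scores, logits)
print(fused.shape)  # (4, 8, 8)
```

Because the weights are normalized per class rather than shared across classes, each class can emphasize a different scale at each pixel, which is the "class-specific scale-adaptive" behavior the abstract claims.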
Appears in Collections: Journal/Magazine Article
Files in This Item:
File    Description    Size    Format
Wang_Deep_Feature_Fusion.pdf        2.7 MB    Adobe PDF    View/Open
Open Access Information
Status: open access
File Version: Version of Record
Access: View full-text via PolyU eLinks SFX Query

Page views: 122 (1 in the last week), as of Mar 24, 2024
Downloads: 100, as of Mar 24, 2024
Web of Science citations: 28, as of Mar 28, 2024

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.