Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/90974
DC Field | Value | Language
dc.contributor | Department of Land Surveying and Geo-Informatics | -
dc.creator | Zhang, H | -
dc.creator | Sun, Y | -
dc.creator | Shi, W | -
dc.creator | Guo, D | -
dc.creator | Zheng, N | -
dc.date.accessioned | 2021-09-03T02:35:48Z | -
dc.date.available | 2021-09-03T02:35:48Z | -
dc.identifier.uri | http://hdl.handle.net/10397/90974 | -
dc.language.iso | en | en_US
dc.publisher | Associazione Italiana di Telerilevamento | en_US
dc.rights | © 2021 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. | en_US
dc.rights | The following publication Zhang, H., Sun, Y., Shi, W., Guo, D., & Zheng, N. (2021). An object-based spatiotemporal fusion model for remote sensing images. European Journal of Remote Sensing, 54(1), 86-101 is available at https://doi.org/10.1080/22797254.2021.1879683 | en_US
dc.subject | Linear injection | en_US
dc.subject | Neighborhood information | en_US
dc.subject | Segmentation | en_US
dc.subject | Spatiotemporal fusion | en_US
dc.title | An object-based spatiotemporal fusion model for remote sensing images | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 86 | -
dc.identifier.epage | 101 | -
dc.identifier.volume | 54 | -
dc.identifier.issue | 1 | -
dc.identifier.doi | 10.1080/22797254.2021.1879683 | -
dcterms.abstract | Spatiotemporal fusion techniques combine the complementary temporal and spatial resolutions of different images to achieve continuous monitoring of the Earth's surface, offering a feasible solution to the trade-off between the temporal and spatial resolutions of remote sensing images. In this paper, an object-based spatiotemporal fusion model (OBSTFM) is proposed to produce spatiotemporally consistent data, especially in areas experiencing non-shape changes (including phenology changes and land cover changes without shape changes). Considering the different changes that might occur in different regions, multi-resolution segmentation is first employed to produce segmented objects, and a linear injection model is then introduced to produce a preliminary prediction. In addition, a new optimized strategy for selecting similar pixels is developed to obtain a more accurate prediction. The performance of the proposed OBSTFM is validated using two remotely sensed datasets, one experiencing phenology changes in a heterogeneous area and the other land cover type changes. Experimental results show that the proposed method is advantageous in areas with non-shape changes and exhibits satisfactory robustness and reliability in blending large-scale abrupt land cover changes. Consequently, OBSTFM has great potential for monitoring highly dynamic landscapes. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | European journal of remote sensing, 2021, v. 54, no. 1, p. 86-101 | -
dcterms.isPartOf | European journal of remote sensing | -
dcterms.issued | 2021 | -
dc.identifier.scopus | 2-s2.0-85100701661 | -
dc.identifier.eissn | 1129-8596 | -
dc.description.validate | 202109 bcvc | -
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | OA_Scopus/WOS | en_US
dc.description.pubStatus | Published | en_US
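The abstract above describes a workflow of multi-resolution segmentation followed by a linear injection model that propagates coarse-image change into the fine image per segmented object. As a rough illustration of that general idea (not the paper's exact formulation — the function, gain definition, and variable names here are hypothetical assumptions), a per-object linear injection can be sketched as follows:

```python
import numpy as np

def linear_injection_predict(fine_t1, coarse_t1, coarse_t2, labels):
    """Hypothetical sketch: predict the fine image at t2 as
    F2 = F1 + g * (C2 - C1), with a gain g estimated per segmented object.
    The gain formula below (ratio of within-object standard deviations)
    is an illustrative assumption, not the paper's method."""
    pred = fine_t1.astype(float).copy()
    delta = coarse_t2.astype(float) - coarse_t1.astype(float)
    for obj in np.unique(labels):
        mask = labels == obj
        # Guard against a flat coarse signal inside the object.
        c_std = coarse_t1[mask].std()
        g = fine_t1[mask].std() / c_std if c_std > 0 else 1.0
        pred[mask] += g * delta[mask]
    return pred

# Toy example: a 4x4 scene split into two segmented objects, with a
# uniform reflectance increase of 0.05 between the two coarse images.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 8).reshape(4, 4)
fine_t1 = rng.uniform(0.1, 0.3, (4, 4))
coarse_t1 = rng.uniform(0.1, 0.3, (4, 4))
coarse_t2 = coarse_t1 + 0.05
pred = linear_injection_predict(fine_t1, coarse_t1, coarse_t2, labels)
```

Because the injection is applied object by object, each segment receives a change term scaled to its own radiometric variability, which is what lets the approach adapt to phenology changes in heterogeneous areas.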
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
An object based spatiotemporal fusion model for remote sensing images.pdf | | 15.93 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Page views: 46 (as of May 19, 2024)
Downloads: 21 (as of May 19, 2024)
Scopus citations: 18 (as of May 16, 2024)
Web of Science citations: 17 (as of May 16, 2024)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.