Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/90628
DC Field | Value | Language
dc.contributor | Department of Land Surveying and Geo-Informatics | en_US
dc.creator | Chen, Y | en_US
dc.creator | Cao, R | en_US
dc.creator | Chen, J | en_US
dc.creator | Zhu, X | en_US
dc.creator | Zhou, J | en_US
dc.creator | Wang, G | en_US
dc.creator | Shen, M | en_US
dc.creator | Chen, X | en_US
dc.creator | Yang, W | en_US
dc.date.accessioned | 2021-08-04T01:52:17Z | -
dc.date.available | 2021-08-04T01:52:17Z | -
dc.identifier.issn | 0196-2892 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/90628 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications/rights/index.html for more information. | en_US
dc.rights | The following publication Y. Chen et al., "A New Cross-Fusion Method to Automatically Determine the Optimal Input Image Pairs for NDVI Spatiotemporal Data Fusion," in IEEE Transactions on Geoscience and Remote Sensing, vol. 58, no. 7, pp. 5179-5194, July 2020 is available at https://dx.doi.org/10.1109/TGRS.2020.2973762 | en_US
dc.subject | Landsat normalized difference vegetation index (NDVI) | en_US
dc.subject | MODIS-Landsat | en_US
dc.subject | NDVI time series | en_US
dc.subject | Spatiotemporal fusion | en_US
dc.subject | VIIRS NDVI | en_US
dc.title | A new cross-fusion method to automatically determine the optimal input image pairs for NDVI spatiotemporal data fusion | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 5179 | en_US
dc.identifier.epage | 5194 | en_US
dc.identifier.volume | 58 | en_US
dc.identifier.issue | 7 | en_US
dc.identifier.doi | 10.1109/TGRS.2020.2973762 | en_US
dcterms.abstract | Spatiotemporal data fusion is a methodology for generating images with both high spatial and high temporal resolution. Most spatiotemporal data fusion methods generate the fused image at a prediction date based on pairs of input images from other dates. The performance of spatiotemporal data fusion is greatly affected by the selection of the input image pair. There are two criteria for selecting the input image pair: the 'similarity' criterion, in which the image at the base date should be as similar as possible to that at the prediction date, and the 'consistency' criterion, in which the coarse and fine images at the base date should be consistent in terms of their radiometric characteristics and imaging geometry. Unfortunately, the 'consistency' criterion has not been quantitatively considered by previous selection strategies. We thus develop a novel method (called 'cross-fusion') to determine the optimal base image pair. The new method first chooses several candidate input image pairs according to the 'similarity' criterion and then takes the 'consistency' criterion into account by employing all of the candidate input image pairs to implement spatiotemporal data fusion between them. We applied the new method to MODIS-Landsat Normalized Difference Vegetation Index (NDVI) data fusion. The results show that the cross-fusion method performs better than four other selection strategies, with lower average absolute difference (AAD) values and higher correlation coefficients in various vegetated regions, including a deciduous forest in Northeast China, an evergreen forest in South China, cropland in the North China Plain, and grassland on the Tibetan Plateau. We simulated scenarios of inconsistency between MODIS and Landsat data and found that the simulated inconsistency is successfully quantified by the new method. In addition, the cross-fusion method is less affected by cloud omission errors. The fused NDVI time-series data generated by the new method tracked various vegetation growth trajectories better than data produced with previous selection strategies. We expect that the cross-fusion method can advance practical applications of spatiotemporal data fusion technology. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on geoscience and remote sensing, July 2020, v. 58, no. 7, 9013055, p. 5179-5194 | en_US
dcterms.isPartOf | IEEE transactions on geoscience and remote sensing | en_US
dcterms.issued | 2020-07 | -
dc.identifier.scopus | 2-s2.0-85087458055 | -
dc.identifier.eissn | 1558-0644 | en_US
dc.identifier.artn | 9013055 | en_US
dc.description.validate | 202108 bcvc | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a0993-n15 | -
dc.identifier.SubFormID | 2366 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
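
The dcterms.abstract above describes the cross-fusion idea: candidate base pairs are first short-listed by the 'similarity' criterion, and the 'consistency' criterion is then assessed by running the fusion between the candidate pairs themselves. The sketch below is a minimal illustration of that cross-checking loop, not the authors' implementation: `fuse` stands in for any generic spatiotemporal fusion routine (e.g., a STARFM-like model), scoring uses the average absolute difference (AAD) mentioned in the abstract, and all function and variable names are hypothetical.

```python
import numpy as np

def aad(predicted, reference):
    """Average absolute difference between two NDVI arrays (lower is better)."""
    return float(np.mean(np.abs(predicted - reference)))

def cross_fusion_select(candidates, fuse):
    """Pick the base pair most consistent with the other candidate pairs.

    candidates : list of (coarse_ndvi, fine_ndvi) array pairs at candidate base
                 dates, already short-listed by the 'similarity' criterion.
    fuse       : callable(coarse_base, fine_base, coarse_target) -> fine_target,
                 any generic spatiotemporal fusion routine.
    Returns the index of the selected pair and the per-pair consistency scores.
    """
    scores = []
    for i, (coarse_i, fine_i) in enumerate(candidates):
        errors = []
        for j, (coarse_j, fine_j) in enumerate(candidates):
            if i == j:
                continue
            # Use pair i as the base to predict the fine NDVI at candidate date j,
            # then compare against the fine image actually observed at date j.
            predicted_j = fuse(coarse_i, fine_i, coarse_j)
            errors.append(aad(predicted_j, fine_j))
        scores.append(float(np.mean(errors)))
    best = int(np.argmin(scores))  # lowest cross-prediction error = most consistent pair
    return best, scores
```

In this reading, each short-listed pair predicts the fine NDVI at the other candidate dates, where real fine images exist for comparison, and the pair with the lowest mean AAD is then used as the base pair for the actual prediction date; the paper's exact scoring and weighting may differ.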
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
2366_TGRS_Cross-fusion.pdf | Pre-Published version | 2.57 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Page views: 111 (last week: 0; as of Apr 14, 2025)

Downloads: 74 (as of Apr 14, 2025)

Scopus citations: 37 (as of Sep 12, 2025)

Web of Science citations: 33 (as of Oct 10, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.