Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/90628
Title: A new cross-fusion method to automatically determine the optimal input image pairs for NDVI spatiotemporal data fusion
Authors: Chen, Y
Cao, R
Chen, J
Zhu, X 
Zhou, J
Wang, G
Shen, M
Chen, X
Yang, W
Issue Date: Jul-2020
Source: IEEE transactions on geoscience and remote sensing, July 2020, v. 58, no. 7, art. no. 9013055, p. 5179-5194
Abstract: Spatiotemporal data fusion is a methodology for generating images with both high spatial and high temporal resolution. Most spatiotemporal data fusion methods generate the fused image at a prediction date from pairs of input images acquired on other dates. The performance of spatiotemporal data fusion is therefore greatly affected by the selection of the input image pair. There are two criteria for selecting the input image pair: the 'similarity' criterion, under which the image at the base date should be as similar as possible to that at the prediction date, and the 'consistency' criterion, under which the coarse and fine images at the base date should be consistent in their radiometric characteristics and imaging geometry. Unfortunately, the 'consistency' criterion has not been quantitatively considered by previous selection strategies. We therefore develop a novel method (called 'cross-fusion') to address the problem of determining the base image pair. The new method first chooses several candidate input image pairs according to the 'similarity' criterion and then accounts for the 'consistency' criterion by using all of the candidate input image pairs to perform spatiotemporal data fusion among themselves. We applied the new method to MODIS-Landsat Normalized Difference Vegetation Index (NDVI) data fusion. The results show that the cross-fusion method outperforms four other selection strategies, with lower average absolute difference (AAD) values and higher correlation coefficients in various vegetated regions, including a deciduous forest in Northeast China, an evergreen forest in South China, cropland in the North China Plain, and grassland on the Tibetan Plateau. We simulated scenarios of inconsistency between MODIS and Landsat data and found that the simulated inconsistency was successfully quantified by the new method. In addition, the cross-fusion method is less affected by cloud omission errors. The fused NDVI time-series data generated by the new method tracked various vegetation growth trajectories better than those generated by previous selection strategies. We expect that the cross-fusion method can advance practical applications of spatiotemporal data fusion technology.
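The abstract describes a two-stage selection logic: rank candidate base dates by 'similarity' to the prediction date, then score 'consistency' by cross-fusing among the candidates and comparing against the observed fine images. The sketch below is a minimal Python illustration of that logic under stated assumptions, not the authors' implementation: it assumes dictionaries of co-registered NDVI arrays keyed by acquisition date, and a generic `fuse` callback standing in for any pair-based fusion model (e.g., a STARFM-like function). All names (`select_candidates`, `pick_base_date`, `fuse`) are hypothetical.

```python
# Minimal sketch of the cross-fusion pair-selection idea (hypothetical names,
# not the authors' code). `fine_series` and `coarse_series` map acquisition
# dates to co-registered fine (Landsat) and coarse (MODIS) NDVI arrays of the
# same shape; candidate dates are assumed to have a clear fine image.
import numpy as np

def select_candidates(coarse_series, pred_date, n_candidates=3):
    """'Similarity' criterion: rank base dates whose coarse NDVI image is
    closest (lowest mean absolute difference) to the coarse image at the
    prediction date."""
    target = coarse_series[pred_date]
    scores = {d: np.mean(np.abs(img - target))
              for d, img in coarse_series.items() if d != pred_date}
    return sorted(scores, key=scores.get)[:n_candidates]

def pick_base_date(fine_series, coarse_series, candidates, fuse):
    """'Consistency' criterion: fuse between every ordered pair of candidate
    dates and keep the base date whose predictions best match the observed
    fine images at the other candidate dates."""
    error = {}
    for base in candidates:
        err = 0.0
        for other in candidates:
            if other == base:
                continue
            # Predict the fine image at `other` from the pair at `base`.
            pred = fuse(fine_series[base], coarse_series[base],
                        coarse_series[other])
            err += np.mean(np.abs(pred - fine_series[other]))
        error[base] = err
    return min(error, key=error.get)
```

A trivial stand-in for `fuse` (for example, fine_base + coarse_target - coarse_base, a simple change-propagation model) is enough to exercise the selection logic before plugging in a full spatiotemporal fusion algorithm.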
Keywords: Landsat normalized difference vegetation index (NDVI)
MODIS-Landsat
NDVI time series
Spatiotemporal fusion
VIIRS NDVI
Publisher: Institute of Electrical and Electronics Engineers
Journal: IEEE transactions on geoscience and remote sensing 
ISSN: 0196-2892
EISSN: 1558-0644
DOI: 10.1109/TGRS.2020.2973762
Rights: © 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications/rights/index.html for more information.
The following publication Y. Chen et al., "A New Cross-Fusion Method to Automatically Determine the Optimal Input Image Pairs for NDVI Spatiotemporal Data Fusion," in IEEE Transactions on Geoscience and Remote Sensing, vol. 58, no. 7, pp. 5179-5194, July 2020 is available at https://dx.doi.org/10.1109/TGRS.2020.2973762
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: 2366_TGRS_Cross-fusion.pdf
Description: Pre-Published version
Size: 2.57 MB
Format: Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
