Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/81643
Title: Spatio-temporal data fusion for satellite images using Hopfield neural network
Authors: Fung, CH 
Wong, MS 
Chan, PW
Issue Date: 2019
Source: Remote Sensing, Sept. 2019, v. 11, no. 18, 2077, p. 1-21
Abstract: Spatio-temporal data fusion refers to techniques that combine the high temporal resolution of coarse satellite images with the high spatial resolution of fine satellite images. However, data availability remains a major limitation in algorithm development. Existing spatio-temporal data fusion algorithms require at least one known image pair between the fine- and coarse-resolution images. However, data from two different satellite platforms do not necessarily overlap in their overpass times, which restricts the application of spatio-temporal data fusion. In this paper, a new algorithm named the Hopfield Neural Network SPatio-tempOral daTa fusion model (HNN-SPOT) is developed by utilizing the optimization concept of the Hopfield neural network (HNN) for spatio-temporal image fusion. The algorithm derives a synthesized fine-resolution image from a coarse-spatial-resolution satellite image (similar to downscaling), using one fine-resolution image taken on an arbitrary date and one coarse image taken on the prediction date. HNN-SPOT particularly addresses the case in which the fine-resolution and coarse-resolution images are acquired at different satellite overpass times over the same geographic extent. Both simulated and real datasets over Hong Kong and Australia were used to evaluate HNN-SPOT. Results showed that HNN-SPOT is comparable to an existing fusion algorithm, the spatial and temporal adaptive reflectance fusion model (STARFM). HNN-SPOT assumes a consistent spatial structure for the target area between the acquisition date and the prediction date, and is therefore more applicable to geographic areas with little or no land cover change. It is shown that HNN-SPOT can produce accurate fusion results, with correlation coefficients above 0.9, over consistent land covers.
For areas that have undergone land cover change, HNN-SPOT can still predict the outlines and tones of features, provided they are large enough to be recorded in the coarse-resolution image on the prediction date. HNN-SPOT provides a relatively new approach to spatio-temporal data fusion, and further improvements can be made by modifying or adding goals and constraints in its HNN architecture. Owing to its lower data prerequisites, HNN-SPOT is expected to broaden the applicability of fine-scale remote sensing applications, such as environmental modeling and monitoring.
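The optimization idea sketched in the abstract can be illustrated with a minimal toy example. This is not the authors' implementation: the function name, the plain gradient-descent relaxation, the quadratic structure term, and all parameter values are illustrative assumptions. Each fine-resolution pixel is treated as a neuron whose state is iteratively relaxed to lower an energy with two parts: a coverage constraint (block means of the synthesized fine image must match the coarse image on the prediction date) and a structure goal (stay close to the reference fine image from the other date).

```python
import numpy as np

def hnn_downscale(coarse, fine_ref, scale=4, lam=0.1, lr=0.2, iters=300):
    """Gradient-style relaxation of an HNN-like fusion energy (illustrative only)."""
    h, w = coarse.shape
    fine = fine_ref.astype(float).copy()   # neuron states, seeded from the reference image
    for _ in range(iters):
        # coverage term: block means of the fine estimate should match the coarse image
        blocks = fine.reshape(h, scale, w, scale).mean(axis=(1, 3))
        g_cov = np.kron(blocks - coarse, np.ones((scale, scale))) / scale**2
        # structure term: stay close to the reference fine image (proxy for spatial goals)
        g_ref = lam * (fine - fine_ref)
        fine -= lr * (g_cov + g_ref)       # relax neuron states toward lower energy
    return fine

# toy demonstration: coarsen a synthetic 16x16 scene, then downscale it again
rng = np.random.default_rng(0)
true_fine = rng.random((16, 16))
coarse = true_fine.reshape(4, 4, 4, 4).mean(axis=(1, 3))   # 4x4 coarse image, prediction date
fine_ref = true_fine + 0.3 * rng.random((16, 16))          # fine image from an arbitrary date
fused = hnn_downscale(coarse, fine_ref, scale=4)
```

In the paper itself the HNN architecture carries explicit, separately weighted goal and constraint terms rather than this single proximity penalty; the sketch only shows why the output honors the coarse-image radiometry while inheriting fine-scale structure from the reference image.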
Keywords: Spatio-temporal data fusion
Hopfield neural network
Satellite images
Publisher: Molecular Diversity Preservation International (MDPI)
Journal: Remote Sensing
EISSN: 2072-4292
DOI: 10.3390/rs11182077
Rights: © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
The following publication Fung, C.H.; Wong, M.S.; Chan, P.W. Spatio-Temporal Data Fusion for Satellite Images Using Hopfield Neural Network. Remote Sens. 2019, 11, 2077, 1-21 is available at https://dx.doi.org/10.3390/rs11182077
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: Fung_Spatio-Temporal_Data_Fusion.pdf
Size: 2.73 MB
Format: Adobe PDF
Open Access Information:
Status: open access
File Version: Version of Record

Page views: 62 (as of May 15, 2022)
Downloads: 63 (as of May 15, 2022)
Scopus™ citations: 13 (as of May 12, 2022)
Web of Science™ citations: 9 (as of May 19, 2022)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.