|Title:||Multisensor fusion for image classification|
|Authors:||Lau, Wai-leung|
|Keywords:||Multisensor data fusion; Hong Kong Polytechnic University -- Dissertations|
|Issue Date:||2000|
|Publisher:||The Hong Kong Polytechnic University|
|Abstract:||The successive launches of earth observation satellites carrying various types of sensors reflect the increasing importance of satellite-based remote sensing. However, owing to physical trade-offs, each sensor captures only a single type of information. For example, a raw image from a spectrally oriented sensor cannot provide high spatial resolution, and a raw image from an optical sensor cannot capture the information recorded by an active sensor. Hence, although many types of images are available, usually only one type of image is used in a single application; image classification using multispectral imagery alone is one example. Multisensor fusion is a framework for integrating multiple types of images into a single image, and the resultant image can inherit some, if not all, characteristics of its parent images. Previous research on multisensor fusion has emphasized algorithm development and the benefits for visual enhancement, while the effect of fusion itself has rarely been investigated. In particular, its effect on the most common application of remote sensing - classification - has received little attention. The objective of this research is therefore to investigate the improvement in classification obtained by applying multisensor fusion to integrate various types of image data. To accomplish this objective, five types of images were obtained to generate the fused images for classification: SPOT multispectral (XS), SPOT panchromatic (PAN), scanned aerial photo (AERIAL), JERS optical and near-infrared (OVN), and JERS SAR (SAR). Three pairs of images (XS + PAN, XS + AERIAL and OVN + SAR) were fused using three common fusion approaches: intensity-hue-saturation (IHS), principal component analysis (PCA) and high-pass filtering (HPF).
To evaluate the influence of fusion on classification at multiple levels, the fused images were then classified for natural land cover and cultural land use features using the maximum likelihood, neural network and contextual classification approaches.
To assess the impact on classification in terms of thematic accuracy, three measures were obtained: the confusion matrix, the Kappa coefficient and the posterior probability. Based on the assessment results, a comparative analysis estimated the degree to which multisensor fusion influenced classification; Fisher's F-test was used during the analysis to evaluate the reliability of the results. To relate classification performance to fusion, the quality of the fused images was assessed through an investigation of information content, image difference and spectral similarity. Measurements of entropy, unsupervised clustering, mean and variance, correlation coefficient and one newly proposed index - the index of noise increase - were applied to these quality factors respectively. Through non-quantitative analysis, the influences of geometric, temporal, scale and radiometric inconsistencies introduced by multisensor fusion were also examined. The results showed that adopting the various multisensor fusion techniques yielded certain degrees of improvement in classification.
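Of the three fusion approaches named in the abstract, intensity-hue-saturation (IHS) substitution is the simplest to illustrate. The sketch below shows the "fast IHS" variant only as a rough illustration of the idea, not the thesis's actual implementation: the multispectral intensity component is approximated by the band mean and replaced by the co-registered panchromatic band. The function name `ihs_fuse` and the array shapes are assumptions for the example.

```python
import numpy as np

def ihs_fuse(ms, pan):
    """Fast IHS pan-sharpening sketch (illustrative only).

    ms  -- (3, H, W) co-registered multispectral bands, float
    pan -- (H, W) panchromatic band resampled to the same grid, float
    """
    intensity = ms.mean(axis=0)        # crude intensity component
    # Substituting pan for intensity injects the pan's spatial detail
    # into every band while roughly preserving the spectral balance.
    return ms + (pan - intensity)
```

By construction, the band mean of the fused result equals the panchromatic image, which is why this substitution transfers the pan band's high spatial frequencies into the multispectral data.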
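The abstract assesses thematic accuracy with a confusion matrix and the Kappa coefficient. As a minimal sketch of how Cohen's Kappa is derived from a confusion matrix (the helper name `kappa` and the row/column orientation are assumptions for the example):

```python
import numpy as np

def kappa(confusion):
    """Cohen's Kappa from a square confusion matrix.

    Rows are assumed to be reference classes, columns classified classes.
    Kappa corrects the observed agreement for agreement expected by chance.
    """
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    po = np.trace(confusion) / n                        # observed agreement
    pe = (confusion.sum(axis=0) @ confusion.sum(axis=1)) / n**2  # chance agreement
    return (po - pe) / (1.0 - pe)
```

A perfect diagonal matrix gives Kappa = 1, while a classifier no better than chance gives Kappa near 0, which is why Kappa is preferred over raw overall accuracy when class proportions are uneven.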
|Description:||ix, 111 leaves : ill. (chiefly col.) ; 30 cm.
PolyU Library Call No.: [THS] LG51 .H577M LSGI 2000 Lau
|URI:||http://hdl.handle.net/10397/3580||Rights:||All rights reserved.|
|Appears in Collections:||Thesis|
Files in This Item:
|b15177816_link.htm||For PolyU Users||162 B||HTML||View/Open|
|b15177816_ir.pdf||For All Users (Non-printable)||8.16 MB||Adobe PDF||View/Open|
Citations as of Oct 14, 2018
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.