|Title:||Boltzmann entropy for spatial information of images|
|Authors:||Gao, Peichao|
|Degree:||Ph.D.|
|Issue Date:||2018|
|Abstract:||Information theory originated in communications in the 1940s and has found applications in a broad range of disciplines such as biology, chemistry, ecology, neuroscience, and geoscience, leading to a series of new interdisciplinary fields including bio-informatics, chem-informatics, eco-informatics, neuro-informatics, and geo-informatics. In information theory, one of the most fundamental issues is the measurement of information content. To date, the most popular and widely accepted measure is the entropy developed by Claude Shannon in 1948, usually referred to as Shannon entropy. Because this entropy is computed from a probability distribution over the components of a dataset, it is a measure of statistical information and depends only on the composition of the dataset. Its applicability to a spatial dataset (e.g., an image) is therefore severely limited, because such a dataset contains not only compositional but also configurational information, and both kinds of information are useful and should be characterized in many applications. To address this problem, calls have recently been made to revisit Boltzmann entropy, which was proposed by Ludwig Boltzmann in 1872 but has remained largely at a conceptual level. Another reason behind these calls is the question of whether Shannon entropy is thermodynamically relevant. This project aims to respond to these calls by tackling the computation and thermodynamic consistency issues of Boltzmann entropy with spatial datasets. Special attention has been paid to numerical raster data in general and images in particular, because such data represent the field-based data model common to many disciplines.
In the development of information theory for various applications, a number of efforts have been made to improve Shannon entropy for characterizing spatial information, resulting in a variety of improved Shannon entropies and variants of Shannon entropy. This project therefore started with a systematic experimental evaluation of these entropies and variants as measures of the spatial information of an image. A set of five criteria was proposed, and corresponding test images were generated. The evaluation revealed that none of these entropies and variants satisfied all the criteria, so none could serve as a measure of the spatial information of an image. This finding further necessitated tackling the computational issues of Boltzmann entropy with images. Boltzmann entropy is defined as a function of the number of possible microscopic states (i.e., microstates) for a given macroscopic state (i.e., macrostate) of a thermodynamic system. The difficulty in computing Boltzmann entropy lies in properly defining the macrostate and precisely determining the number of microstates. To overcome this difficulty, a hierarchy-based (or multiscale) strategy was proposed in this project. Specifically, for a given image, the macrostate was defined as its up-scaled version, and the number of microstates was defined as the number of possibilities for the up-scaled image to be down-scaled back to the original scale. In this way, Boltzmann entropy can be computed between the images at any two adjacent scales in the hierarchical (multiscale) representation. This entropy is termed "relative Boltzmann entropy", and a large image yields a number of relative entropies. To take image size into account, "absolute Boltzmann entropy" was introduced as the sum of all the relative entropies. Experimental validation was carried out using both simulated and real-life images.
The results attested to the effectiveness of the proposed computation method: the computed Boltzmann entropies were capable of completely characterizing the spatial information of images. The method was, however, inefficient in terms of computational time.
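The hierarchy-based computation described above can be illustrated with a minimal brute-force sketch. The details below are illustrative assumptions not specified in this record: up-scaling is taken to aggregate each non-overlapping 2×2 block into its pixel sum, the number of microstates of a block is the number of pixel-value combinations reproducing that sum, and the entropy is the base-2 logarithm of the microstate count (per-block counts multiply, so their logarithms add). Pixel values are restricted to a few gray levels to keep the enumeration tractable.

```python
from itertools import product
from math import log2

def count_microstates(block_sum, n_pixels=4, levels=4):
    """Brute-force count of the ways n_pixels values in 0..levels-1 can
    sum to block_sum, i.e., the ways one up-scaled pixel (storing the
    block sum) could be down-scaled back to a 2x2 block."""
    return sum(1 for combo in product(range(levels), repeat=n_pixels)
               if sum(combo) == block_sum)

def relative_boltzmann_entropy(image, levels=4):
    """Relative Boltzmann entropy (log base 2, Boltzmann constant k = 1)
    between an image and its one-step up-scaled version, summed over all
    non-overlapping 2x2 blocks."""
    h, w = len(image), len(image[0])
    entropy = 0.0
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            block_sum = (image[i][j] + image[i][j + 1]
                         + image[i + 1][j] + image[i + 1][j + 1])
            entropy += log2(count_microstates(block_sum, 4, levels))
    return entropy

# A uniform all-zero image has exactly one microstate per block: entropy 0.
print(relative_boltzmann_entropy([[0, 0], [0, 0]]))  # 0.0
# A block sum of 4 (e.g., four pixels of value 1) admits 31 microstates.
print(count_microstates(4))  # 31
```

The absolute Boltzmann entropy would then be obtained by repeating the up-scaling until a single pixel remains and summing the relative entropies of all adjacent scale pairs.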
To improve computational efficiency, two strategies were developed. One was to reduce the amount of computation required to determine the total number of microstates by replacing the original numerical solution with an analytical one. The other was to increase the amount of computation performed per unit time through a parallel-computing-based solution. Experimental evaluations were conducted using images of varying sizes (from 2×2 to 1,000×1,000 pixels). The computational time decreased significantly, e.g., from 2,557 s to 41 s for a 1,000×1,000-pixel image, demonstrating the effectiveness of the two proposed strategies.

As thermodynamic consistency is a prerequisite for interpreting an entropy in thermodynamic terms, a systematic evaluation of the thermodynamic consistency of Boltzmann entropies was conducted. An evaluation method was developed based on the kinetic theory of gases: gaseous mixing in a closed system was simulated using a series of images, and each entropy was then examined for an upward trend during the mixing. A total of 50,000 simulated images were generated. The evaluation showed that the relative Boltzmann entropy was thermodynamically consistent, whereas the absolute Boltzmann entropy was only partially consistent. The cause of this problem was determined through a hypothesis-driven experiment, and two solutions were developed to resolve it.

Finally, Boltzmann entropy was applied to the optimal band selection of hyperspectral remote sensing images to demonstrate its usefulness. In band selection, the critical issue is quantifying the similarity between two bands; here, this was done using the difference in Boltzmann entropy between the two bands. Comparative evaluations were conducted with a 220-band image, using classification accuracy as the measure of comparison.
The results showed that classification accuracy was greatly improved (by up to 27% with 20 selected bands) in comparison with traditional Shannon entropy-based methods. A further comparison with several state-of-the-art methods showed that the proposed method remained highly competitive, outperforming all the others when the number of selected bands ranged from 18 to 23. Boltzmann entropy may thus form a new basis for information-theoretic approaches to image processing, and even for spatial information science in a broad sense. It may also open the door to studying the thermodynamics of geographical pattern-process relationships across scales in space and time.
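The band-selection step can be sketched as follows. The record states only that band similarity was quantified by the difference in Boltzmann entropy between two bands; the greedy max-min selection rule, the function name, and the toy entropy values below are illustrative assumptions, not the thesis's actual algorithm.

```python
def select_bands(band_entropies, k):
    """Greedy band-selection sketch: two bands are treated as similar when
    their Boltzmann entropies are close. The selection is grown by adding,
    at each step, the band whose minimum entropy difference to the
    already-selected bands is largest (the most dissimilar remaining band),
    so the chosen subset carries diverse spatial information."""
    # Seed with the band of highest entropy (most spatial information).
    selected = [max(range(len(band_entropies)), key=band_entropies.__getitem__)]
    while len(selected) < k:
        remaining = [i for i in range(len(band_entropies)) if i not in selected]
        best = max(remaining,
                   key=lambda i: min(abs(band_entropies[i] - band_entropies[j])
                                     for j in selected))
        selected.append(best)
    return sorted(selected)

# Toy example with hypothetical per-band Boltzmann entropies.
print(select_bands([1.0, 5.0, 2.0, 9.0], k=2))  # [0, 3]
```

With real data, `band_entropies` would hold one Boltzmann entropy per band of the hyperspectral cube (220 values for the image used in the thesis), and `k` the desired number of bands.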
|Subjects:||Hong Kong Polytechnic University -- Dissertations
Entropy (Information theory)
Spatial analysis (Statistics)
|Pages:||ix, 150 pages : color illustrations|
|Appears in Collections:||Thesis|
View full-text via https://theses.lib.polyu.edu.hk/handle/200/9666