Title: Improved biomass estimation using the texture parameters of two high-resolution optical sensors
Authors: Nichol, JE 
Sarker, MLR
Keywords: Ecology
Geophysical image processing
Principal component analysis
Regression analysis
Vegetation mapping
AVNIR-2 instrument
SPOT-5 instrument
Allometric model
Biophysical complexity
Climate change modeling
Environmental complexity
Forest biomass estimation
Forest ecosystem
Greenhouse gas inventory
Image processing technique
Multiple regression model
Optical sensor
Saturation level
Terrestrial carbon accounting
Texture parameter
Topographic complexity
Biomass estimation
Texture measurement
Issue Date: 2011
Publisher: Institute of Electrical and Electronics Engineers
Source: IEEE transactions on geoscience and remote sensing, 2011, v. 49, no. 3, p. 930-948
Journal: IEEE transactions on geoscience and remote sensing 
Abstract: Accurate forest biomass estimation is essential for greenhouse gas inventories, terrestrial carbon accounting, and climate change modeling studies. Unfortunately, no universal and transferable technique has been developed so far to quantify biomass carbon sources and sinks over large areas because of the environmental, topographic, and biophysical complexity of forest ecosystems. Among the remote sensing techniques tested, the use of multisensors and the spatial as well as the spectral characteristics of the data have demonstrated a strong potential for forest biomass estimation. However, the use of multisensor data accompanied by spatial data processing has not been fully investigated because of the unavailability of appropriate data sets and the complexity of image processing techniques in combining multisensor data with the analysis of the spatial characteristics. This paper investigates the texture parameters of two high-resolution (10 m) optical sensors (Advanced Visible and Near Infrared Radiometer type 2 (AVNIR-2) and SPOT-5) in different processing combinations for biomass estimation. Multiple regression models are developed between image parameters extracted from the different stages of image processing and the biomass of 50 field plots, which was estimated using a newly developed "allometric model" for the study region. The results demonstrate a clear improvement in biomass estimation using the texture parameters of a single sensor (r2 = 0.854 and rmse = 38.54) compared to the best result obtained from simple spectral reflectance (r2 = 0.494) and simple spectral band ratios (r2 = 0.59). This was further improved to obtain a very promising result using the texture parameters of both sensors together (r2 = 0.897 and rmse = 32.38), the texture parameters from the principal component analysis of both sensors (r2 = 0.851 and rmse = 38.80), and the texture parameters from the averaging of both sensors (r2 = 0.911 and rmse = 30.10).
Improvement was also observed using the simple ratio of the texture parameters of AVNIR-2 (r2 = 0.899 and rmse = 32.04) and SPOT-5 (r2 = 0.916), and finally, the most promising result (r2 = 0.939 and rmse = 24.77) was achieved using the ratios of the texture parameters of both sensors together. This high level of agreement between the field and image data derived from the two novel techniques (i.e., combination/fusion of the multisensor data and the ratio of the texture parameters) is a very significant improvement over previous work where agreement not exceeding r2 = 0.65 has been achieved using optical sensors. Furthermore, biomass estimates of up to 500 t/ha in our study area far exceed the saturation levels observed in other studies using optical sensors.
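The workflow the abstract describes — extracting texture parameters from image windows around field plots and regressing them against plot biomass — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the gray-level co-occurrence matrix (GLCM) measures, the 21×21 window size, the gray-level quantization, and all data here are assumptions; the paper's actual texture measures, window sizes, and 50 field plots are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def glcm_features(patch, levels=16):
    """Haralick-style texture measures from a symmetric, normalized GLCM
    at a 1-pixel horizontal offset (one common configuration; the choice
    of offsets, windows, and measures here is illustrative only)."""
    # Quantize 8-bit gray values to `levels` bins
    q = (patch.astype(float) / 256 * levels).astype(int)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
        glcm[j, i] += 1          # symmetric co-occurrence counts
    glcm /= glcm.sum()           # normalize to joint probabilities
    ii, jj = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    contrast = ((ii - jj) ** 2 * glcm).sum()
    homogeneity = (glcm / (1.0 + np.abs(ii - jj))).sum()
    energy = (glcm ** 2).sum()
    mean = (ii * glcm).sum()
    variance = ((ii - mean) ** 2 * glcm).sum()
    return [contrast, homogeneity, energy, mean, variance]

# Synthetic stand-ins: one 21x21 image window per plot and hypothetical
# biomass values in t/ha (the study used 50 real field plots).
patches = [rng.integers(0, 256, (21, 21)) for _ in range(50)]
X = np.array([glcm_features(p) for p in patches])
y = rng.uniform(50, 500, size=50)

# Multiple regression by ordinary least squares (intercept + texture terms)
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The paper's two novel steps would slot into this pipeline: combining the AVNIR-2 and SPOT-5 feature sets (concatenation, PCA, or averaging) before the regression, and taking ratios of texture parameters as additional predictors.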
ISSN: 0196-2892
EISSN: 1558-0644
DOI: 10.1109/TGRS.2010.2068574
Appears in Collections:Journal/Magazine Article

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.