Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/37721
Title: Optimizing radial basis probabilistic neural networks using recursive orthogonal least squares algorithms combined with micro-genetic algorithms
Authors: Zhao, W
Huang, DS
Guo, L
Keywords: Agriculture
Genetic algorithms
Least squares approximations
Pattern classification
Radial basis function networks
Issue Date: 2003
Source: Proceedings of the International Joint Conference on Neural Networks (IJCNN'2003), Portland, Oregon, 20-24 July 2003, p. 2277-2282
Abstract: This paper discusses how to train and optimize the radial basis probabilistic neural network (RBPNN) structure using Recursive Orthogonal Least Squares Algorithms (ROLSA) combined with Micro-Genetic Algorithms (μ-GA). First, the previous ROLSA, used to optimally select the hidden centers of the RBPNN, was improved in two respects: adopting new dual error criteria and a new stopping condition. Secondly, the micro-genetic algorithm, used to optimize the controlling parameter of the kernel function, was incorporated into the improved ROLSA so that the RBPNN structure could be optimized as a whole. Finally, to demonstrate the power of our approach, two examples, the two-spirals classification problem and the IRIS classification problem, were employed to validate the classification performance. The experimental results showed that, for the two-spirals problem, the RBPNN structure was considerably compressed from 200 initial hidden centers to 30 hidden centers, and for the IRIS classification problem only 9 of the 75 initial hidden centers were selected for the optimized RBPNN structure. In contrast, for the Radial Basis Function Neural Network (RBFNN) under the same conditions, 46 hidden centers remained for the two-spirals problem, and 15 hidden centers were selected into the optimal structure of the RBFNN for the IRIS problem. Moreover, the experimental results also showed that the generalization performance of the optimized RBPNN on the two examples was clearly better than that of the optimized RBFNN.
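The hidden-center selection that the abstract describes builds on orthogonal least squares subset selection for RBF networks. The following is a minimal sketch of that idea, assuming Gaussian kernels and the classical greedy OLS error-reduction ratio with a single energy-ratio stopping threshold; it is not the paper's recursive implementation, its dual error criteria, or the μ-GA kernel-parameter search.

```python
import numpy as np

def gaussian_kernel(X, centers, sigma):
    # Pairwise squared distances between samples and candidate centers
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def ols_select_centers(X, y, sigma, tol=0.9):
    """Greedy OLS selection of RBF hidden centers (sketch).

    Candidate centers are the training points themselves. At each step
    the kernel-matrix column whose orthogonalized component explains
    the largest fraction of the remaining target energy is selected;
    selection stops once the cumulative explained-energy ratio
    exceeds `tol`.
    """
    P = gaussian_kernel(X, X, sigma)  # candidate regressor matrix
    n = P.shape[1]
    selected, Q = [], []              # chosen columns, orthogonalized regressors
    explained = 0.0
    y_energy = float(y @ y)
    for _ in range(n):
        best_err, best_j, best_q = -1.0, -1, None
        for j in range(n):
            if j in selected:
                continue
            q = P[:, j].copy()
            for qk in Q:                      # Gram-Schmidt against chosen set
                q -= (qk @ P[:, j]) / (qk @ qk) * qk
            qq = q @ q
            if qq < 1e-12:                    # numerically dependent column
                continue
            err = (q @ y) ** 2 / (qq * y_energy)  # error-reduction ratio
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        if best_j < 0:
            break
        selected.append(best_j)
        Q.append(best_q)
        explained += best_err
        if explained >= tol:
            break
    return selected

# Toy usage: a linearly separable 2-D problem
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0]).astype(float)
centers = ols_select_centers(X, y, sigma=1.0, tol=0.9)
print(f"kept {len(centers)} of {len(X)} candidate centers")
```

In practice the greedy loop prunes most candidates, which is the same compression effect the abstract reports (e.g. 200 → 30 centers); the paper's recursive formulation avoids re-orthogonalizing from scratch at every step.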
URI: http://hdl.handle.net/10397/37721
ISBN: 0-7803-7898-9
DOI: 10.1109/IJCNN.2003.1223766
Appears in Collections:Conference Paper


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.