Title: Localized generalization error and its application to RBFNN training
Authors: Ng, WWY
Yeung, DS
Tsang, ECC
Wang, XZ
Keywords: Error analysis
Generalisation (artificial intelligence)
Learning (artificial intelligence)
Pattern classification
Radial basis function networks
Issue Date: 2005
Publisher: IEEE
Source: Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, 18-21 August 2005, Guangzhou, China, v. 8, p. 4667-4673
Abstract: The generalization error bounds that current error models derive for the entire input space, based on the number of effective parameters of a classifier and the number of training samples, are usually very loose. However, classifiers such as SVMs, RBFNNs, and MLPNNs are in effect local learning machines for many application problems, treating unseen samples close to the training samples as more important. In this paper, we propose a localized generalization error model that bounds the generalization error from above within a neighborhood of the training samples, using a stochastic sensitivity measure (the expectation of the squared output perturbations). The model is then used to develop a model selection technique that finds the classifier with maximal coverage of unseen samples subject to a given generalization error threshold. Experiments on eight real-world datasets show that, compared with cross-validation, sequential learning, and two other ad hoc methods, our technique consistently yields the best testing classification accuracy with fewer hidden neurons and less training time.
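
The key quantity in the abstract, the stochastic sensitivity measure (the expectation of the squared output perturbations), and the threshold-based model selection built on it can be sketched concretely. The Python sketch below is a Monte Carlo approximation under simplifying assumptions, not the authors' method: it fits a Gaussian RBFNN with randomly selected centers and least-squares output weights, draws input perturbations uniformly from a [-q, q] hypercube as the neighborhood of each training sample, and uses a simplified combination of training error and sensitivity as the selection criterion (the paper derives an analytical sensitivity expression and a rigorous upper bound). All function names and the bound proxy are illustrative.

import numpy as np

def rbfnn_output(X, centers, widths, weights):
    # Gaussian RBFNN: y(x) = sum_j w_j * exp(-||x - c_j||^2 / (2 * v_j)).
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # (N, M)
    return np.exp(-sq_dist / (2.0 * widths)) @ weights

def fit_rbfnn(X, y, m, width=1.0, seed=0):
    # Stand-in training routine (an assumption, not the paper's procedure):
    # centers are a random subset of the training data, widths are fixed,
    # and output weights are solved by least squares.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=m, replace=False)]
    widths = np.full(m, width)
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    weights, *_ = np.linalg.lstsq(np.exp(-sq_dist / (2.0 * widths)), y, rcond=None)
    return centers, widths, weights

def stochastic_sensitivity(X, centers, widths, weights, q, n_mc=500, seed=0):
    # Monte Carlo estimate of the stochastic sensitivity E[(Delta y)^2]:
    # the expected squared output perturbation when each training input is
    # perturbed uniformly within the hypercube [-q, q]^n around it.
    rng = np.random.default_rng(seed)
    y0 = rbfnn_output(X, centers, widths, weights)
    total = 0.0
    for _ in range(n_mc):
        dx = rng.uniform(-q, q, size=X.shape)
        total += np.mean((rbfnn_output(X + dx, centers, widths, weights) - y0) ** 2)
    return total / n_mc

def select_hidden_neurons(X, y, candidate_sizes, threshold, q):
    # Model selection sketch: try candidate numbers of hidden neurons in
    # increasing order and return the smallest network whose localized-error
    # proxy stays below the threshold. The proxy (sqrt(R_emp) + sqrt(E_S))^2
    # merely combines training error and sensitivity; the paper's bound is
    # derived rigorously and includes further terms.
    for m in sorted(candidate_sizes):
        centers, widths, weights = fit_rbfnn(X, y, m)
        r_emp = np.mean((rbfnn_output(X, centers, widths, weights) - y) ** 2)
        e_s = stochastic_sensitivity(X, centers, widths, weights, q)
        proxy = (np.sqrt(r_emp) + np.sqrt(e_s)) ** 2
        if proxy <= threshold:
            return m, proxy
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, size=(200, 2))
    y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
    # Smallest candidate network whose error proxy stays under the threshold.
    print(select_hidden_neurons(X, y, [5, 10, 20, 40], threshold=0.2, q=0.1))

Note that this sketch fixes the neighborhood size q and searches over architectures, whereas the paper's technique instead maximizes the coverage of unseen samples achievable under a specified generalization error threshold; the sketch does not reproduce that optimization.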
URI: http://hdl.handle.net/10397/31118
ISBN: 0-7803-9091-1
DOI: 10.1109/ICMLC.2005.1527762
Appears in Collections: Conference Paper

