Title: A fast kernel-based nonlinear discriminant analysis for multi-class problems
Authors: Xu, Y
Zhang, D 
Jin, Z
Li, M
Yang, JY
Keywords: Face recognition
Fast kernel-based nonlinear method
Feature extraction
Fisher discriminant analysis
Kernel-based nonlinear discriminant analysis
Pattern recognition
Issue Date: 2006
Publisher: Elsevier
Source: Pattern recognition, 2006, v. 39, no. 6, p. 1026-1033
Journal: Pattern recognition 
Abstract: Nonlinear discriminant analysis may be transformed into the form of kernel-based discriminant analysis. Thus, the corresponding discriminant direction can be solved by linear equations. From the viewpoint of feature space, nonlinear discriminant analysis is still a linear method, and it is provable that in feature space the method is equivalent to Fisher discriminant analysis. We consider that one linear combination of part of the training samples, called "significant nodes", can replace the total training samples in expressing the corresponding discriminant vector in feature space to some extent. In this paper, an efficient algorithm is proposed to determine "significant nodes" one by one. The principle of determining "significant nodes" is simple and reasonable, and the consequent algorithm can be carried out with acceptable computation cost. Classification can then be implemented using only the kernel functions between test samples and all "significant nodes". The proposed method is called the fast kernel-based nonlinear method (FKNM). It is noticeable that the number of "significant nodes" may be much smaller than the number of total training samples. As a result, for two-class classification problems, the FKNM is much more efficient than the naive kernel-based nonlinear method (NKNM). The FKNM can also be applied to multi-class problems via two approaches: one-against-the-rest and one-against-one. Although there is a view that one-against-one is superior to one-against-the-rest in classification efficiency, it seems that for the FKNM one-against-the-rest is more efficient than one-against-one. Experiments on benchmark and real datasets illustrate that, for both two-class and multi-class classification, the FKNM is effective, feasible and highly efficient.
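The abstract notes that in feature space the discriminant vector is a linear combination of training samples, so classification needs only kernel values between a test sample and the samples carrying nonzero coefficients (the "significant nodes" in FKNM). The following is a minimal sketch of the underlying two-class kernel Fisher discriminant, not the authors' node-selection algorithm; the function names, the RBF kernel, and the regularization term are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel values between rows of A and rows of B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_fda_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant: the discriminant vector in
    feature space is w = sum_i alpha_i * phi(x_i), so only the kernel
    matrix is needed (a sketch; FKNM would instead seek a sparse alpha
    supported on a few "significant nodes")."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    idx0, idx1 = (y == 0), (y == 1)
    # Kernelized class means
    M0 = K[:, idx0].mean(axis=1)
    M1 = K[:, idx1].mean(axis=1)
    # Kernelized within-class scatter
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    # Regularized linear system for the expansion coefficients
    alpha = np.linalg.solve(N + reg * np.eye(n), M1 - M0)
    # Threshold halfway between the projected class means
    proj = K @ alpha
    b = 0.5 * (proj[idx0].mean() + proj[idx1].mean())
    return alpha, b

def kernel_fda_predict(X_train, alpha, b, X_test, gamma=1.0):
    # Classification uses only kernel values between test samples and
    # the training samples with nonzero alpha; replacing X_train with a
    # small set of nodes is what makes FKNM fast.
    Kt = rbf_kernel(X_test, X_train, gamma)
    return (Kt @ alpha > b).astype(int)
```

With a sparse alpha supported on m nodes, predicting one sample costs m kernel evaluations instead of n, which is the efficiency gain the abstract describes.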
ISSN: 0031-3203
EISSN: 1873-5142
DOI: 10.1016/j.patcog.2005.10.029
Appears in Collections:Journal/Magazine Article

