Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/33732
Title: Accelerating the kernel-method-based feature extraction procedure from the viewpoint of numerical approximation
Authors: Xu, Y
Zhang, D 
Keywords: Feature extraction
Kernel methods
Kernel minimum squared error
Kernel PCA
Pattern recognition
Issue Date: 2011
Source: Neural computing and applications, 2011, v. 20, no. 7, p. 1087-1096
Journal: Neural Computing and Applications 
Abstract: The kernel method suffers from the following problem: the computational efficiency of the feature extraction procedure is inversely proportional to the size of the training sample set. In this paper, from a novel viewpoint, we propose a very simple and mathematically tractable method to produce a computationally efficient kernel-method-based feature extraction procedure. We first address the issue of how to make the feature extraction result of the reformulated kernel method closely approximate that of the naïve kernel method. We identify the training samples that statistically contribute most to the feature extraction results and exploit them to reformulate the kernel method, producing a computationally efficient kernel-method-based feature extraction procedure. The basic idea of the proposed method is as follows: when a training sample has little effect on the feature extraction result and is statistically highly correlated with the other training samples, the feature extraction term associated with this training sample can be removed from the feature extraction procedure. The proposed method has the following advantages. First, it is the first to improve the kernel method through a formal and reasonable evaluation of the feature extraction terms. Second, it improves the kernel method at a low extra cost and thus has a much more computationally efficient training phase than most previous improvements to the kernel method. Experimental comparison shows that the proposed method performs well in classification problems. The paper also gives an intuitive view of the geometrical relation between the identified training samples and the other training samples.
URI: http://hdl.handle.net/10397/33732
ISSN: 0941-0643
DOI: 10.1007/s00521-011-0534-5
Appears in Collections: Journal/Magazine Article
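
Note: The abstract above describes expanding a test sample's feature only over a retained subset of training samples, so that extraction cost scales with the subset size rather than the full training set. The following is a minimal Python sketch of that general idea under assumed choices, not the authors' algorithm: the RBF kernel, the correlation-based redundancy score in select_expansion_samples, and the keep_ratio parameter are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def select_expansion_samples(X_train, keep_ratio=0.5, gamma=1.0):
    # Hypothetical selection rule: drop the training samples whose kernel row is
    # most strongly correlated with the remaining rows (statistically redundant).
    # The paper's actual criterion also weighs each sample's contribution to the
    # feature extraction result.
    K = rbf_kernel(X_train, X_train, gamma)
    C = np.corrcoef(K)                 # correlation between kernel rows
    redundancy = C.sum(axis=1)         # high value -> highly correlated with the rest
    n_keep = max(1, int(keep_ratio * len(X_train)))
    return np.argsort(redundancy)[:n_keep]

def kpca_features(X_train, X_test, n_components=2, keep_ratio=0.5, gamma=1.0):
    # Kernel PCA in which the feature of a test sample x is expanded only over
    # the selected subset: f(x) = sum_i alpha_i k(x, x_i), i in the subset.
    idx = select_expansion_samples(X_train, keep_ratio, gamma)
    Xs = X_train[idx]
    K = rbf_kernel(Xs, Xs, gamma)
    # Centre the kernel matrix in feature space.
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    w, V = np.linalg.eigh(Kc)
    order = np.argsort(w)[::-1][:n_components]
    alphas = V[:, order] / np.sqrt(np.maximum(w[order], 1e-12))
    # Project test samples using only the reduced expansion set
    # (centring of the test kernel is omitted to keep the sketch short).
    K_test = rbf_kernel(X_test, Xs, gamma)
    return K_test @ alphas

# Example: extraction cost now scales with the retained subset, not all 200 samples.
X_train = np.random.randn(200, 10)
X_test = np.random.randn(5, 10)
features = kpca_features(X_train, X_test, n_components=3, keep_ratio=0.3)
print(features.shape)   # (5, 3)
```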
