Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/75243
Title: Sensitivity and generalization of SVM with weighted and reduced features
Authors: Hu, YX 
Liu, JNK 
Jia, LW
Issue Date: 2012
Publisher: Springer
Source: In H Dai, JNK Liu & E Smirnov (Eds.), Reliable knowledge discovery, p. 161-182. New York: Springer, 2012
Abstract: Support Vector Machine, as a modern statistical learning method based on the principle of structural risk minimization rather than empirical risk minimization, has been widely applied to small-sample, non-linear and high-dimensional problems. Many new versions of SVM have been proposed to improve its performance. Some of these versions focus on processing the features used by SVM, for example by assigning weight values to the features or by removing unnecessary features. A new feature-weighted SVM and a feature-reduced SVM are proposed in this chapter. The two versions of SVM are applied to regression tasks to predict the price of a certain stock, and their outputs are compared with those of the classical SVM. The results show that the proposed feature-weighted SVM can improve the accuracy of the regression, and that the proposed feature-reduced SVM is sensitive to the data sample used for testing.
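
The chapter's own formulations are not reproduced in this record. As a rough, hedged illustration of the two ideas summarized in the abstract, the sketch below (assuming scikit-learn's SVR and an entirely hypothetical weight vector) scales each feature by an importance weight before training, which corresponds to a feature-weighted kernel, and separately drops low-weight features to mimic feature reduction.

import numpy as np
from sklearn.svm import SVR

# Synthetic regression data standing in for the stock-price series used
# in the chapter (hypothetical; the original data is not available here).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))                       # 100 samples, 4 features
y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + 0.1 * rng.normal(size=100)

# Feature weighting: multiply each feature by its assumed importance weight
# before fitting, equivalent to evaluating the kernel on weighted inputs.
w = np.array([1.0, 0.8, 0.2, 0.2])                  # hypothetical weights
weighted_model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X * w, y)

# Feature reduction: discard the low-weight features entirely and retrain
# on the remaining columns.
keep = w > 0.5
reduced_model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:, keep], y)

print(weighted_model.predict((X * w)[:5]))
print(reduced_model.predict(X[:5, keep]))

How the weights are derived, and how the reduced model's sensitivity to the test sample is measured, follows the chapter itself; the snippet above only illustrates the mechanics of weighting versus reduction.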
URI: http://hdl.handle.net/10397/75243
ISBN: 9781461419037 (electronic bk.)
1461419034 (electronic bk.)
9781461419020 (print)
DOI: 10.1007/978-1-4614-1903-7
Appears in Collections: Book Chapter