Title: Sensitivity and generalization of SVM with weighted and reduced features
Authors: Hu, YX 
Liu, JNK 
Jia, LW
Issue Date: 2012
Publisher: Springer
Source: In H. Dai, J. N. K. Liu & E. Smirnov (Eds.), Reliable Knowledge Discovery, pp. 161-182. New York: Springer, 2012
Abstract: Support Vector Machine (SVM), a modern statistical learning method based on the principle of structural risk minimization rather than empirical risk minimization, has been widely applied to small-sample, non-linear, and high-dimensional problems. Many new versions of SVM have been proposed to improve its performance. Some of these versions focus on processing the features used by SVM, for example, by assigning weight values to features or by removing unnecessary features. A new feature weighted SVM and a feature reduced SVM are proposed in this chapter. The two versions of SVM are applied to a regression task, predicting the price of a certain stock, and their outputs are compared with those of the classical SVM. The results show that the proposed feature weighted SVM can improve the accuracy of the regression, while the proposed feature reduced SVM is sensitive to the data sample used for testing.
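The feature weighting idea described in the abstract can be illustrated with a minimal sketch: multiplying each input feature by a weight before training rescales the input space, which changes the distance metric used by an RBF kernel. The weights, the synthetic data, and the SVR hyperparameters below are all assumptions for illustration; the chapter derives its own weighting scheme, which is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical data: 100 samples with 5 features
# (e.g. daily stock indicators) and a noisy linear target.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([0.8, 0.1, 0.05, 0.03, 0.02]) + rng.normal(scale=0.1, size=100)

# Assumed feature weights; larger weight = more influence on the kernel.
w = np.array([1.0, 0.5, 0.3, 0.2, 0.1])

# Scaling the columns by w before fitting a standard SVR implements
# feature weighting: the RBF kernel now measures weighted distances.
model = SVR(kernel="rbf", C=10.0, epsilon=0.01)
model.fit(X * w, y)
pred = model.predict(X * w)
print(pred.shape)  # (100,)
```

Feature reduction can be sketched the same way by setting some weights to zero (or dropping those columns entirely) before fitting, which is the degenerate case of the weighting above.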
ISBN: 9781461419037 (electronic bk.)
1461419034 (electronic bk.)
9781461419020 (print)
DOI: 10.1007/978-1-4614-1903-7
Appears in Collections: Book Chapter
