Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/12228
Title: Theoretically optimal parameter choices for support vector regression machines with noisy input
Authors: Wang, S
Zhu, J
Chung, FL 
Lin, Q
Hu, D
Keywords: Huber loss functions
Norm-r loss functions
Regularized linear regression
Support vectors
Issue Date: 2005
Publisher: Springer
Source: Soft computing, 2005, v. 9, no. 10, p. 732-741
Journal: Soft computing 
Abstract: Within the evidence framework, the regularized linear regression model is interpreted in this paper as the corresponding maximum a posteriori (MAP) problem, and the general dependency relationships that the optimal parameters of this model should follow under noisy input are then derived. The support vector regression machines Huber-SVR and norm-r r-SVR are two typical examples of this model, and particular attention is paid to their optimal parameter choices. It turns out that, in the presence of typical Gaussian input noise, the parameter μ in Huber-SVR depends linearly on the input noise, while the parameter r in r-SVR is inversely proportional to the input noise. These theoretical results will be helpful for applying kernel-based regression techniques effectively in practical applications.
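To make the abstract's key result concrete, the sketch below defines the Huber loss and sets its parameter μ proportionally to the input-noise standard deviation, reflecting the linear dependency the paper derives. The proportionality constant `c` and the function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def huber_loss(residual, mu):
    """Huber loss: quadratic for |residual| <= mu, linear beyond.

    mu marks the transition between the quadratic and linear regimes.
    """
    r = np.abs(residual)
    return np.where(r <= mu, 0.5 * r ** 2, mu * r - 0.5 * mu ** 2)

def choose_mu(noise_std, c=1.0):
    """Set mu proportionally to the input-noise standard deviation,
    following the linear-dependency result stated in the abstract.
    The constant c is problem-dependent and assumed here purely
    for illustration."""
    return c * noise_std

# Example: evaluate the loss on residuals from a fit whose
# input-noise level sigma is (assumed to be) known.
sigma = 0.5
mu = choose_mu(sigma)
residuals = np.array([-1.2, 0.1, 0.4, 2.0])
losses = huber_loss(residuals, mu)
```

Small residuals (|r| ≤ μ) are penalized quadratically, like least squares, while large ones grow only linearly, which is what gives Huber-SVR its robustness to outliers; scaling μ with the noise level keeps the quadratic zone matched to the typical noise magnitude.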
URI: http://hdl.handle.net/10397/12228
ISSN: 1432-7643
DOI: 10.1007/s00500-004-0406-3
Appears in Collections:Journal/Magazine Article


Scopus™ citations: 14 (as of Sep 11, 2017)
Web of Science™ citations: 9 (as of Sep 15, 2017)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.