Please use this identifier to cite or link to this item:
Title: Sparsity and error analysis of empirical feature-based regularization schemes
Authors: Guo, X 
Fan, J
Zhou, DX
Keywords: Sparsity
Concave regularizer
Reproducing kernel Hilbert space
Regularization with empirical features
SCAD penalty
Issue Date: 2016
Publisher: MIT Press
Source: Journal of Machine Learning Research, 2016, v. 17, no. 89, p. 1-34
Journal: Journal of Machine Learning Research
Abstract: We consider a learning algorithm generated by a regularization scheme with a concave regularizer for the purpose of achieving sparsity and good learning rates in a least squares regression setting. The regularization is induced for linear combinations of empirical features, constructed in the literature of kernel principal component analysis and kernel projection machines, based on kernels and samples. In addition to the separability of the involved optimization problem caused by the empirical features, we carry out sparsity and error analysis, giving bounds in the norm of the reproducing kernel Hilbert space, based on a priori conditions which do not require assumptions on sparsity in terms of any basis or system. In particular, we show that as the concave exponent q of the concave regularizer increases to 1, the learning ability of the algorithm improves. Some numerical simulations for both artificial and real MHC-peptide binding data involving the ℓq regularizer and the SCAD penalty are presented to demonstrate the sparsity and error analysis.
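The abstract refers to two concave penalties: the ℓq regularizer (with exponent 0 < q < 1) and the SCAD penalty. As a hedged illustration only, not the paper's implementation, the sketch below codes the standard elementwise forms of these penalties; the parameter values (lam, a, q) are conventional defaults chosen here for illustration, not the settings used in the paper's experiments.

```python
import numpy as np

def scad_penalty(theta, lam=1.0, a=3.7):
    """Standard SCAD penalty, applied elementwise (illustrative sketch).

    lam: regularization level; a: concavity parameter (a > 2; 3.7 is a
    conventional default). The penalty is linear near zero, quadratic in
    a middle range, and constant for large |theta|.
    """
    t = np.abs(np.asarray(theta, dtype=float))
    linear = lam * t                                          # |theta| <= lam
    quad = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))  # lam < |theta| <= a*lam
    const = lam**2 * (a + 1) / 2                              # |theta| > a*lam
    return np.where(t <= lam, linear, np.where(t <= a * lam, quad, const))

def lq_penalty(theta, lam=1.0, q=0.5):
    """Concave l_q penalty lam * |theta|^q for 0 < q < 1 (illustrative sketch)."""
    return lam * np.abs(np.asarray(theta, dtype=float)) ** q
```

Both penalties are concave in |theta|, which is what drives the sparsity of the minimizers studied in the paper; SCAD in addition flattens out for large coefficients, so large signals are not shrunk.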
ISSN: 1532-4435
EISSN: 1533-7928
Appears in Collections:Journal/Magazine Article

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.