Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/88216
Title: Sparsity and error analysis of empirical feature-based regularization schemes
Authors: Guo, X 
Fan, J
Zhou, DX
Issue Date: 13-Jun-2016
Source: Paper presented at 2016 ICSA Applied Statistics Symposium, Hyatt Regency Atlanta, Atlanta, Georgia, USA, 12-15 June 2016
Abstract: We consider a learning algorithm generated by a regularization scheme with a concave regularizer, aimed at achieving sparsity and good learning rates in a least squares regression setting. The regularization is imposed on linear combinations of empirical features, constructed from kernels and samples in the literature on kernel principal component analysis and kernel projection machines. Beyond the separability of the optimization problem induced by the empirical features, we carry out a sparsity and error analysis, giving bounds in the norm of the reproducing kernel Hilbert space under a priori conditions that do not require sparsity assumptions with respect to any basis or system. In particular, we show that as the concave exponent q of the concave regularizer increases to 1, the learning ability of the algorithm improves. Numerical simulations on both artificial data and real MHC-peptide binding data, involving the q regularizer and the SCAD penalty, are presented to illustrate the sparsity and error analysis.
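The separability mentioned in the abstract can be illustrated with a minimal sketch: when the empirical features are orthonormal eigenvectors of the kernel matrix, the penalized least squares objective with a concave |c|^q penalty splits into independent scalar problems, one per feature. This is not the authors' implementation; the data, kernel width, and parameters (q, lam) below are illustrative assumptions, and the scalar subproblems are solved by a simple grid search rather than any method from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression sample (sizes and noise level are illustrative, not from the paper).
n = 60
x = np.sort(rng.uniform(-1, 1, n))
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(n)

# Gaussian kernel matrix on the sample.
sigma = 0.3
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))

# Empirical features: orthonormal eigenvectors of the kernel matrix
# (the kernel PCA directions), ordered by decreasing eigenvalue.
evals, evecs = np.linalg.eigh(K)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# Because the columns of evecs are orthonormal, the penalized problem
#   min_c ||y - V c||^2 + lam * sum_j |c_j|^q
# separates into independent one-dimensional problems, one per coefficient.
q, lam = 0.5, 0.05          # concave exponent in (0, 1] and penalty weight (assumed values)
b = evecs.T @ y             # unpenalized scalar least squares solutions

def scalar_prox(bj, lam, q, grid=2001):
    """Solve min_c (c - bj)^2 + lam * |c|^q by a sign-aware grid search on [0, |bj|]."""
    c = np.linspace(0.0, abs(bj), grid)
    obj = (c - abs(bj)) ** 2 + lam * c ** q
    return np.sign(bj) * c[np.argmin(obj)]

c = np.array([scalar_prox(bj, lam, q) for bj in b])
yhat = evecs @ c

sparsity = np.mean(c == 0.0)
print(f"fraction of zero coefficients: {sparsity:.2f}")
print(f"training RMSE: {np.sqrt(np.mean((y - yhat) ** 2)):.3f}")
```

For q < 1 the penalty has infinite slope at zero, so small projections b_j are set exactly to zero, which is the sparsity effect the abstract refers to; the SCAD penalty mentioned there would replace the |c|^q term in the scalar subproblem.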
Rights: Posted with permission of the author.
Appears in Collections:Presentation

Files in This Item:
File: Atlanta2016Jun.pdf
Size: 62.12 kB
Format: Adobe PDF
Open Access Information
Status: open access
File Version: Other Version

Page views: 25 (as of May 15, 2022)
Downloads: 2 (as of May 15, 2022)

