Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/83375
dc.contributor: Department of Applied Mathematics
dc.creator: Wang, Chendi
dc.identifier.uri: https://theses.lib.polyu.edu.hk/handle/200/9498
dc.language.iso: English
dc.title: Learning with centered reproducing kernels
dc.type: Thesis
dcterms.abstract: Over the past twenty years, reproducing kernels and kernel-based learning algorithms have been widely and successfully applied across scientific research and industry, and have been studied extensively. Many of these algorithms take the form of an optimization problem whose objective function typically consists of a fidelity term that fits the observations and a regularization term that prevents over-fitting. Examples include support vector machines for classification and regularized least squares for regression. In many regression problems, however, the constant component of the regression function should be treated differently, and existing kernel methods do not model this distinction well; score-based ranking function regression is one such example. In this thesis, we study a class of Centered Reproducing Kernels (CRKs), which separate the constant component from the reproducing kernel Hilbert space. We provide a non-asymptotic convergence analysis of the empirical CRK-based regularized least squares.
dcterms.accessRights: open access
dcterms.educationLevel: M.Phil.
dcterms.extent: x, 66 pages
dcterms.issued: 2018
dcterms.LCSH: Hong Kong Polytechnic University -- Dissertations
dcterms.LCSH: Kernel functions
dcterms.LCSH: Hilbert space
Appears in Collections: Thesis
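The abstract describes regularized least squares with a kernel whose constant component is handled separately. As a rough illustration only (not the thesis's exact CRK construction), the following numpy sketch centers an ordinary Gaussian kernel matrix empirically via K_c = H K H with H = I - (1/n)11^T, solves the regularized least-squares system on the centered kernel, and fits the constant component separately as the sample mean of the responses. All data, the kernel choice, and the regularization parameter are illustrative assumptions.

```python
import numpy as np

# Toy regression data (assumed, for illustration only)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = 2.0 + np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)

def gaussian_kernel(A, B, sigma=0.5):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

n = X.shape[0]
K = gaussian_kernel(X, X)

# Empirical centering: K_c = H K H with H = I - (1/n) 1 1^T.
# This removes the constant component from the (empirical) feature space.
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H

# Regularized least squares on the centered kernel; the constant
# component is fitted separately as the sample mean of y.
lam = 1e-2
alpha = np.linalg.solve(Kc + lam * n * np.eye(n), y - y.mean())

def predict(Xnew):
    """Predict at new points: mean of y plus the centered-kernel part."""
    Kt = gaussian_kernel(Xnew, X)
    M = np.ones((Xnew.shape[0], n)) / n
    Ktc = (Kt - M @ K) @ H  # center test rows against the training sample
    return y.mean() + Ktc @ alpha
```

On the training points the centered test kernel reduces to K_c itself, so `predict(X)` recovers the regularized fit; the intercept is carried entirely by `y.mean()` rather than by the kernel expansion.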