Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/91840
Title: Online gradient descent algorithms for functional data learning
Authors: Chen, X
Tang, B
Fan, J
Guo, X 
Issue Date: 2021
Source: Journal of complexity, 2021, in press, article 101635, https://doi.org/10.1016/j.jco.2021.101635
Abstract: The functional linear model is a widely applied framework for regression problems, including those with intrinsically infinite-dimensional data. Online gradient descent methods, despite their demonstrated power for processing online or large-scale data, have not been well studied for learning with functional data. In this paper, we study reproducing kernel-based online learning algorithms for functional data and derive convergence rates for the expected excess prediction risk under both the online and the finite-horizon step-size settings. It is well understood that nontrivial uniform convergence rates for the estimation task depend on the regularity of the slope function. Surprisingly, the convergence rates we derive for the prediction task require no regularity assumption on the slope function. Our analysis reveals an intrinsic difference between the estimation task and the prediction task in functional data learning.
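To illustrate the kind of method the abstract describes, the following is a minimal simulation sketch, not the paper's exact algorithm: online gradient descent for the functional linear model Y = ⟨beta, X⟩ + noise, with functions discretized on a grid and the stochastic gradient mapped through a reproducing kernel integral operator. The Gaussian kernel, the grid size, the step-size schedule, and the simulated slope function are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (illustrative assumptions throughout, not the paper's
# exact algorithm): reproducing kernel-based online gradient descent for
# the functional linear model Y = <beta, X> + noise on [0, 1].

rng = np.random.default_rng(0)
m = 100                                  # grid size (assumed)
grid = np.linspace(0.0, 1.0, m)
w = 1.0 / m                              # quadrature weight for L2 inner products

# Gaussian reproducing kernel on [0, 1] (an illustrative choice).
K = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / 0.1)

def inner(f, g):
    """L2 inner product on [0, 1], approximated on the grid."""
    return w * float(f @ g)

beta_true = np.sin(2 * np.pi * grid)     # slope function assumed for simulation
beta = np.zeros(m)                       # online estimate, initialized at zero

for t in range(1, 5001):
    # Draw a random smooth input curve and its noisy response.
    a = rng.standard_normal(3)
    X = a[0] + a[1] * np.sin(2 * np.pi * grid) + a[2] * np.cos(2 * np.pi * grid)
    Y = inner(beta_true, X) + 0.1 * rng.standard_normal()

    # Stochastic gradient of the squared prediction error, mapped into the
    # RKHS via the kernel integral operator: grad = (pred - Y) * (L_K X).
    pred = inner(beta, X)
    LK_X = w * (K @ X)
    eta = 0.5 / np.sqrt(t)               # polynomially decaying step size (online setting)
    beta = beta - eta * (pred - Y) * LK_X

# Estimate the excess prediction risk on fresh input curves.
errs = []
for _ in range(500):
    a = rng.standard_normal(3)
    X = a[0] + a[1] * np.sin(2 * np.pi * grid) + a[2] * np.cos(2 * np.pi * grid)
    errs.append((inner(beta, X) - inner(beta_true, X)) ** 2)
mse = float(np.mean(errs))
```

Note that, consistent with the abstract's point, the quantity monitored here is the prediction risk ⟨beta − beta_true, X⟩², not the estimation error ‖beta − beta_true‖, and the former can converge without regularity assumptions on the slope.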
Keywords: Learning theory
Online learning
Gradient descent
Reproducing kernel Hilbert space
Error analysis
Publisher: Academic Press
Journal: Journal of complexity 
ISSN: 0885-064X
EISSN: 1090-2708
DOI: 10.1016/j.jco.2021.101635
Appears in Collections:Journal/Magazine Article

Open Access Information
Status embargoed access
Embargo End Date 0000-00-00 (to be updated)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.