Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/91840
Title: Online gradient descent algorithms for functional data learning
Authors: Chen, X
Tang, B
Fan, J
Guo, X 
Issue Date: Jun-2022
Source: Journal of Complexity, June 2022, v. 70, 101635
Abstract: The functional linear model is a widely applied framework for regression problems, including those with intrinsically infinite-dimensional data. Online gradient descent methods, despite their demonstrated power for processing online or large-scale data, have not been well studied for learning with functional data. In this paper, we study reproducing kernel-based online learning algorithms for functional data and derive convergence rates for the expected excess prediction risk under both the online and the finite-horizon step-size settings. It is well understood that nontrivial uniform convergence rates for the estimation task depend on the regularity of the slope function. Surprisingly, the convergence rates we derive for the prediction task require no regularity assumptions on the slope function. Our analysis reveals an intrinsic difference between the estimation task and the prediction task in functional data learning.
Keywords: Learning theory
Online learning
Gradient descent
Reproducing kernel Hilbert space
Error analysis
Publisher: Academic Press
Journal: Journal of Complexity
ISSN: 0885-064X
EISSN: 1090-2708
DOI: 10.1016/j.jco.2021.101635
Appears in Collections: Journal/Magazine Article
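
The abstract describes reproducing kernel-based online gradient descent for the functional linear model Y = ∫ X(s) β(s) ds + noise, with the slope learned in the reproducing kernel Hilbert space of a kernel K. The following is a minimal sketch of that idea on a discretized grid, not the paper's implementation: the kernel choice, the simulated data, the grid quadrature, and the 1/√t step-size decay are all illustrative assumptions.

import numpy as np

# Minimal sketch (illustrative assumptions throughout, not the paper's code):
# online gradient descent for the functional linear model
#     Y = integral of X(s) * beta(s) ds + noise,
# with the slope learned in the RKHS of a kernel K via the update
#     f_{t+1} = f_t - eta_t * (<X_t, f_t>_{L2} - Y_t) * (L_K X_t),
# where (L_K X)(s) = integral of K(s, u) X(u) du.

rng = np.random.default_rng(0)

m = 100                              # grid points discretizing [0, 1]
s = np.linspace(0.0, 1.0, m)
ds = s[1] - s[0]                     # uniform quadrature weight

K = np.minimum.outer(s, s)           # assumed kernel K(s, u) = min(s, u)
beta_true = np.sin(2 * np.pi * s)    # illustrative slope function

f = np.zeros(m)                      # current slope estimate on the grid
T = 2000                             # number of online rounds

for t in range(1, T + 1):
    # One simulated observation: a crude random covariate function and response.
    X = rng.standard_normal(m)
    y = ds * (X @ beta_true) + 0.1 * rng.standard_normal()

    # Predicted response <X_t, f_t>_{L2}, approximated by a Riemann sum.
    y_hat = ds * (X @ f)

    # RKHS gradient of the squared loss: (y_hat - y) * L_K X_t.
    grad = (y_hat - y) * (K @ X) * ds

    eta = 1.0 / np.sqrt(t)           # assumed decaying step size (online setting)
    f -= eta * grad

# Crude excess-prediction-risk proxy on fresh simulated covariates.
X_new = rng.standard_normal((500, m))
gap = ds * (X_new @ (f - beta_true))
print("mean squared prediction gap:", float(np.mean(gap ** 2)))

Each update follows the stochastic gradient of the squared loss in the RKHS, with the L2 pairing against X_t represented through the integral operator L_K. The polynomially decaying step size above corresponds to the online setting mentioned in the abstract; a step size chosen in advance from a known sample size T would correspond to the finite-horizon setting.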

Open Access Information
Status: Embargoed access
Embargo End Date: 2024-06-30

