Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/91840
DC Field | Value | Language
---|---|---
dc.contributor | Department of Applied Mathematics | en_US |
dc.creator | Chen, X | en_US |
dc.creator | Tang, B | en_US |
dc.creator | Fan, J | en_US |
dc.creator | Guo, X | en_US |
dc.date.accessioned | 2021-12-23T02:14:45Z | - |
dc.date.available | 2021-12-23T02:14:45Z | - |
dc.identifier.issn | 0885-064X | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/91840 | - |
dc.language.iso | en | en_US |
dc.publisher | Academic Press | en_US |
dc.rights | © 2021 Elsevier Inc. All rights reserved. | en_US |
dc.rights | © 2021. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/. | en_US |
dc.rights | The following publication Chen, X., Tang, B., Fan, J., & Guo, X. (2022). Online gradient descent algorithms for functional data learning. Journal of Complexity, 70, 101635 is available at https://dx.doi.org/10.1016/j.jco.2021.101635. | en_US |
dc.subject | Learning theory | en_US |
dc.subject | Online learning | en_US |
dc.subject | Gradient descent | en_US |
dc.subject | Reproducing kernel Hilbert space | en_US |
dc.subject | Error analysis | en_US |
dc.title | Online gradient descent algorithms for functional data learning | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.volume | 70 | en_US |
dc.identifier.doi | 10.1016/j.jco.2021.101635 | en_US |
dcterms.abstract | The functional linear model is a widely applied general framework for regression problems, including those with intrinsically infinite-dimensional data. Online gradient descent methods, despite their demonstrated power for processing online or large-scale data, are not well studied for learning with functional data. In this paper, we study reproducing kernel-based online learning algorithms for functional data and derive convergence rates for the expected excess prediction risk under both the online and the finite-horizon setting of step-sizes. It is well understood that nontrivial uniform convergence rates for the estimation task depend on the regularity of the slope function. Surprisingly, the convergence rates we derive for the prediction task require no regularity assumption on the slope function. Our analysis reveals an intrinsic difference between the estimation task and the prediction task in functional data learning. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Journal of complexity, June 2022, v. 70, 101635 | en_US |
dcterms.isPartOf | Journal of complexity | en_US |
dcterms.issued | 2022-06 | - |
dc.identifier.eissn | 1090-2708 | en_US |
dc.identifier.artn | 101635 | en_US |
dc.description.validate | 202112 bcvc | en_US |
dc.description.oa | Accepted Manuscript | en_US |
dc.identifier.FolderNumber | a1122-n01 | - |
dc.identifier.SubFormID | 43965 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | RGC: 15304917 | en_US |
dc.description.fundingText | Others: ZE8Q | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | Green (AAM) | en_US |
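The abstract describes reproducing kernel-based online gradient descent for the functional linear model. A minimal sketch of such an update on grid-discretized curves is given below; the Gaussian kernel, the Brownian-like input paths, the slope function, and the polynomially decaying step-size schedule are all illustrative assumptions, not details taken from the paper:

```python
import numpy as np

# Discretize [0, 1] on a regular grid so curves become vectors.
m = 50
grid = np.linspace(0.0, 1.0, m)
dt = grid[1] - grid[0]

# Assumed Gaussian reproducing kernel; K is its Gram matrix on the grid.
def kernel(s, u, sigma=0.2):
    return np.exp(-(s - u) ** 2 / (2.0 * sigma ** 2))

K = kernel(grid[:, None], grid[None, :])

rng = np.random.default_rng(0)
beta_true = np.sin(2.0 * np.pi * grid)  # hypothetical slope function

def sample(n):
    # Rough Brownian-like predictor curves X_t and responses
    # y_t = <beta_true, X_t>_{L2} + noise.
    X = rng.standard_normal((n, m)).cumsum(axis=1) / np.sqrt(m)
    y = X @ beta_true * dt + 0.1 * rng.standard_normal(n)
    return X, y

# Online gradient descent in the RKHS:
#   beta_{t+1} = beta_t - eta_t * (<beta_t, X_t>_{L2} - y_t) * L_K X_t,
# where (L_K X)(s) = \int K(s, u) X(u) du is approximated by K @ x * dt.
beta = np.zeros(m)
X, y = sample(2000)
for t, (x, yt) in enumerate(zip(X, y), start=1):
    pred = beta @ x * dt            # current prediction <beta_t, X_t>_{L2}
    eta = 1.0 / np.sqrt(t)          # decaying step-size (online setting)
    beta -= eta * (pred - yt) * (K @ x) * dt

# Empirical proxy for the excess prediction risk on fresh data.
Xte, yte = sample(500)
mse = np.mean((Xte @ beta * dt - yte) ** 2)
```

The update never forms the slope estimate outside the span of the kernel-smoothed inputs, which is why only kernel evaluations on the grid are needed at each step.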
Appears in Collections: | Journal/Magazine Article |
Files in This Item:
File | Description | Size | Format
---|---|---|---
ChenEtAl21.pdf | Pre-Published version | 699.47 kB | Adobe PDF
Page views: 169 (last week: 1, last month: 1), as of Apr 14, 2025
Downloads: 29, as of Apr 14, 2025
SCOPUS™ citations: 12, as of Jun 21, 2024
Web of Science™ citations: 20, as of Apr 24, 2025
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.