Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/114206
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Data Science and Artificial Intelligence | en_US |
dc.creator | Chen, S | en_US |
dc.creator | Long, G | en_US |
dc.creator | Jiang, J | en_US |
dc.creator | Zhang, C | en_US |
dc.date.accessioned | 2025-07-15T08:45:45Z | - |
dc.date.available | 2025-07-15T08:45:45Z | - |
dc.identifier.uri | http://hdl.handle.net/10397/114206 | - |
dc.description | 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025, Philadelphia, 25 February-4 March 2025 | en_US |
dc.language.iso | en | en_US |
dc.publisher | Association for the Advancement of Artificial Intelligence | en_US |
dc.rights | Copyright © 2025, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved. | en_US |
dc.rights | This is the author's manuscript of the following paper: Chen, S., Long, G., Jiang, J., & Zhang, C. (2025). Federated Foundation Models on Heterogeneous Time Series. Proceedings of the AAAI Conference on Artificial Intelligence, 39(15), 15839-15847, which is available at https://doi.org/10.1609/aaai.v39i15.33739. | en_US |
dc.title | Federated foundation models on heterogeneous time series | en_US |
dc.type | Conference Paper | en_US |
dc.identifier.spage | 15839 | en_US |
dc.identifier.epage | 15847 | en_US |
dc.identifier.volume | 39 | en_US |
dc.identifier.issue | 15 | en_US |
dc.identifier.doi | 10.1609/aaai.v39i15.33739 | en_US |
dcterms.abstract | Training general-purpose time series foundation models with robust generalization capabilities across diverse applications from scratch remains an open challenge. Efforts have primarily focused on fusing cross-domain time series datasets to extract shared subsequences as tokens for training models on Transformer architectures. However, due to significant statistical heterogeneity across domains, this cross-domain fusion approach does not work as effectively as fusing texts and images. To tackle this challenge, this paper proposes a novel federated learning approach, namely FFTS, to address the heterogeneity in time series foundation model training. Specifically, each data-holding organization is treated as an independent client in a collaborative learning framework with federated settings, and client-specific local models are trained to preserve the unique characteristics of each dataset. Moreover, a new regularization mechanism is applied on both the client side and the server side to align the shared knowledge across heterogeneous datasets from different domains. Extensive experiments on benchmark datasets demonstrate the effectiveness of the proposed federated learning approach. The newly learned time series foundation models achieve superior generalization capabilities on cross-domain time series analysis tasks. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | In Proceedings of the AAAI Conference on Artificial Intelligence, v. 39, no. 15, p. 15839-15847 | en_US |
dcterms.issued | 2025-04 | - |
dc.identifier.scopus | 2-s2.0-105004003106 | - |
dc.relation.conference | Conference on Artificial Intelligence [AAAI] | en_US |
dc.description.validate | 202507 bcwh | en_US |
dc.description.oa | Accepted Manuscript | en_US |
dc.identifier.FolderNumber | a3866 | - |
dc.identifier.SubFormID | 51468 | - |
dc.description.fundingSource | Self-funded | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | Green (AAM) | en_US |
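The abstract describes a federated setup in which each data-holding organization trains a client-specific local model, a server aggregates them, and regularization aligns shared knowledge across heterogeneous domains. The sketch below is a minimal, generic illustration of that pattern, not the paper's FFTS algorithm: it uses FedAvg-style weighted aggregation on the server and a FedProx-style proximal term on each client as a stand-in for the paper's regularization mechanism. All names (`client_update`, `server_aggregate`, the linear model, and the hyperparameters) are hypothetical.

```python
import numpy as np

def client_update(global_w, X, y, mu=0.1, lr=0.01, epochs=20):
    """One client's local training step on a linear model.

    The proximal term mu * (w - global_w) pulls the local model toward
    the shared global model, a common way to regularize local training
    under statistical heterogeneity (FedProx-style; a stand-in for the
    paper's client-side regularization).
    """
    w = global_w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y) + mu * (w - global_w)
        w -= lr * grad
    return w

def server_aggregate(client_ws, sizes):
    """Server step: average client models weighted by local sample count
    (FedAvg-style aggregation)."""
    total = sum(sizes)
    return sum(n / total * w for w, n in zip(client_ws, sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Two "domains" with different input statistics, mimicking the
# heterogeneous clients described in the abstract.
clients = []
for scale in (1.0, 5.0):
    X = rng.normal(0, scale, size=(100, 2))
    y = X @ true_w + rng.normal(0, 0.1, size=100)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(30):  # federated communication rounds
    updates = [client_update(w_global, X, y) for X, y in clients]
    w_global = server_aggregate(updates, [len(y) for _, y in clients])
```

Despite the clients' differing input scales, the proximal term keeps each local update anchored to the global model, so the aggregated model converges toward parameters that fit both domains.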
Appears in Collections: | Conference Paper |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
C248.AAAI-Camery-ready.pdf | Pre-Published version | 7.17 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.