Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114206
Title: Federated foundation models on heterogeneous time series
Authors: Chen, S
Long, G
Jiang, J
Zhang, C 
Issue Date: Apr-2025
Source: In Proceedings of the AAAI Conference on Artificial Intelligence, v. 39, no. 15, p. 15839-15847
Abstract: Training a general-purpose time series foundation model with robust generalization across diverse applications from scratch remains an open challenge. Existing efforts focus primarily on fusing cross-domain time series datasets and extracting shared subsequences as tokens for training Transformer-based models. However, owing to significant statistical heterogeneity across domains, this cross-domain fusion does not work as effectively as fusing texts and images. To tackle this challenge, this paper proposes a novel federated learning approach, named FFTS, to address heterogeneity in time series foundation model training. Specifically, each data-holding organization is treated as an independent client in a collaborative learning framework with federated settings, and client-specific local models are trained to preserve the unique characteristics of each dataset. Moreover, a new regularization mechanism is applied on both the client side and the server side to align the shared knowledge across heterogeneous datasets from different domains. Extensive experiments on benchmark datasets demonstrate the effectiveness of the proposed federated learning approach: the learned time series foundation models achieve superior generalization on cross-domain time series analysis tasks.
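The abstract's idea of client-specific local training regularized toward shared server-side knowledge can be illustrated with a generic federated sketch. This is a toy FedProx-style proximal-regularization example on scalar linear models, not the paper's FFTS algorithm; the function names, `prox_mu`, and the synthetic client data are all assumptions made for illustration.

```python
# Toy federated-training sketch: each client trains a local model with a
# proximal regularization term pulling it toward the shared global model,
# and the server averages the client models each round. This is a generic
# FedProx-style illustration, NOT the paper's FFTS method; `prox_mu` and
# the scalar linear model are assumptions.

def local_train(w_global, data, prox_mu=0.1, lr=0.01, steps=50):
    """Gradient descent on squared loss plus prox_mu * (w - w_global)^2."""
    w = w_global
    for _ in range(steps):
        # Gradient of the mean squared error over this client's (x, y) pairs.
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        # Proximal term keeps the local model aligned with the global model.
        grad += 2 * prox_mu * (w - w_global)
        w -= lr * grad
    return w

def federated_round(w_global, client_datasets, prox_mu=0.1):
    """One round: every client trains locally, then the server averages."""
    local_models = [local_train(w_global, d, prox_mu) for d in client_datasets]
    return sum(local_models) / len(local_models)

# Two statistically heterogeneous clients: true slopes 2.0 and 3.0.
clients = [
    [(1.0, 2.0), (2.0, 4.0)],  # y = 2x
    [(1.0, 3.0), (2.0, 6.0)],  # y = 3x
]
w = 0.0
for _ in range(100):
    w = federated_round(w, clients)
print(round(w, 2))  # converges near the average slope, 2.5
```

The proximal term lets each client keep a model fitted to its own distribution while the averaging step extracts the knowledge the clients share, which is the general intuition behind two-sided regularization in heterogeneous federated settings.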
Publisher: Association for the Advancement of Artificial Intelligence
DOI: 10.1609/aaai.v39i15.33739
Description: 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025, Philadelphia, 25 February-4 March 2025
Rights: Copyright © 2025, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
This is the author's manuscript of the following paper: Chen, S., Long, G., Jiang, J., & Zhang, C. (2025). Federated Foundation Models on Heterogeneous Time Series. Proceedings of the AAAI Conference on Artificial Intelligence, 39(15), 15839-15847, which is available at https://doi.org/10.1609/aaai.v39i15.33739.
Appears in Collections: Conference Paper

Files in This Item:
File: C248.AAAI-Camery-ready.pdf
Description: Pre-Published version
Size: 7.17 MB
Format: Adobe PDF
Open Access Information
Status: Open access
File Version: Final Accepted Manuscript


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.