Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114206
DC Field | Value | Language
dc.contributor | Department of Data Science and Artificial Intelligence | en_US
dc.creator | Chen, S | en_US
dc.creator | Long, G | en_US
dc.creator | Jiang, J | en_US
dc.creator | Zhang, C | en_US
dc.date.accessioned | 2025-07-15T08:45:45Z | -
dc.date.available | 2025-07-15T08:45:45Z | -
dc.identifier.uri | http://hdl.handle.net/10397/114206 | -
dc.description | 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025, Philadelphia, 25 February-4 March 2025 | en_US
dc.language.iso | en | en_US
dc.publisher | Association for the Advancement of Artificial Intelligence | en_US
dc.rights | Copyright © 2025, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved. | en_US
dc.rights | This is the author's manuscript of the following paper: Chen, S., Long, G., Jiang, J., & Zhang, C. (2025). Federated Foundation Models on Heterogeneous Time Series. Proceedings of the AAAI Conference on Artificial Intelligence, 39(15), 15839-15847, which is available at https://doi.org/10.1609/aaai.v39i15.33739. | en_US
dc.title | Federated foundation models on heterogeneous time series | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 15839 | en_US
dc.identifier.epage | 15847 | en_US
dc.identifier.volume | 39 | en_US
dc.identifier.issue | 15 | en_US
dc.identifier.doi | 10.1609/aaai.v39i15.33739 | en_US
dcterms.abstract | Training a general-purpose time series foundation model with robust generalization across diverse applications from scratch is still an open challenge. Existing efforts focus primarily on fusing cross-domain time series datasets to extract shared subsequences as tokens for training Transformer-based models. However, due to significant statistical heterogeneity across domains, this cross-domain fusion does not work as effectively as fusing texts and images. To tackle this challenge, this paper proposes FFTS, a novel federated learning approach that addresses heterogeneity in training time series foundation models. Specifically, each data-holding organization is treated as an independent client in a collaborative learning framework with federated settings, and client-specific local models are trained to preserve the unique characteristics of each dataset. Moreover, a new regularization mechanism is applied on both the client side and the server side to align the shared knowledge across heterogeneous datasets from different domains. Extensive experiments on benchmark datasets demonstrate the effectiveness of the proposed federated learning approach. The newly learned time series foundation models achieve superior generalization on cross-domain time series analysis tasks. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | In Proceedings of the AAAI Conference on Artificial Intelligence, v. 39, no. 15, p. 15839-15847 | en_US
dcterms.issued | 2025-04 | -
dc.identifier.scopus | 2-s2.0-105004003106 | -
dc.relation.conference | Conference on Artificial Intelligence [AAAI] | en_US
dc.description.validate | 202507 bcwh | en_US
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.FolderNumber | a3866 | -
dc.identifier.SubFormID | 51468 | -
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
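
The abstract above describes the training pattern only at a high level: client-specific local models, plus client- and server-side regularization toward shared knowledge. As a rough illustration of that pattern, here is a minimal NumPy sketch assuming a FedProx-style proximal term on the client and a damped weighted average on the server; the names `client_update` and `server_aggregate` and the hyperparameters `mu` and `lam` are hypothetical, not FFTS's published mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_update(w_global, X, y, mu=0.1, lr=0.01, epochs=5):
    # Local training on one client's private data: least-squares loss
    # plus a proximal term mu * ||w - w_global||^2 (an assumed client-side
    # regularizer pulling the client-specific model toward shared knowledge).
    w = w_global.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the data loss
        grad += 2 * mu * (w - w_global)        # gradient of the proximal term
        w -= lr * grad
    return w

def server_aggregate(w_global, client_models, sizes, lam=0.5):
    # Assumed server-side step: size-weighted average of client models,
    # damped toward the previous global model (server-side regularizer).
    avg = np.average(client_models, axis=0, weights=sizes)
    return (1 - lam) * w_global + lam * avg

# Toy run: three clients whose feature scales differ, mimicking the
# statistical heterogeneity across domains mentioned in the abstract.
d = 8
clients = []
for s in (0.5, 1.0, 2.0):
    X = rng.normal(scale=s, size=(50, d))
    y = X @ rng.normal(size=d) + rng.normal(size=50)
    clients.append((X, y))

w_global = np.zeros(d)
for _ in range(10):  # communication rounds
    local_models = [client_update(w_global, X, y) for X, y in clients]
    w_global = server_aggregate(w_global, local_models,
                                [len(y) for _, y in clients])
```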
Appears in Collections: Conference Paper

Files in This Item:
File | Description | Size | Format
C248.AAAI-Camery-ready.pdf | Pre-Published version | 7.17 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.