Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107695
Title: AdaptCL: adaptive continual learning for tackling heterogeneity in sequential datasets
Authors: Zhao, Y 
Saxena, D 
Cao, J 
Issue Date: Feb-2025
Source: IEEE transactions on neural networks and learning systems, Feb. 2025, v. 36, no. 2, p. 2509-2522
Abstract: Managing heterogeneous datasets that vary in complexity, size, and similarity in continual learning presents a significant challenge. Task-agnostic continual learning is necessary to address this challenge, as datasets with varying similarity pose difficulties in distinguishing task boundaries. Conventional task-agnostic continual learning practices typically rely on rehearsal or regularization techniques. However, rehearsal methods may struggle with varying dataset sizes and regulating the importance of old and new data due to rigid buffer sizes. Meanwhile, regularization methods apply generic constraints to promote generalization but can hinder performance when dealing with dissimilar datasets lacking shared features, necessitating a more adaptive approach. In this article, we propose a novel adaptive continual learning (AdaptCL) method to tackle heterogeneity in sequential datasets. AdaptCL employs fine-grained data-driven pruning to adapt to variations in data complexity and dataset size. It also utilizes task-agnostic parameter isolation to mitigate the impact of varying degrees of catastrophic forgetting caused by differences in data similarity. Through a two-pronged case study approach, we evaluate AdaptCL on both datasets of MNIST variants and DomainNet, as well as datasets from different domains. The latter include both large-scale, diverse binary-class datasets and few-shot, multiclass datasets. Across all these scenarios, AdaptCL consistently exhibits robust performance, demonstrating its flexibility and general applicability in handling heterogeneous datasets.
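The abstract names two mechanisms: fine-grained data-driven pruning (sparsity adapted to each dataset) and task-agnostic parameter isolation (weights kept for earlier tasks are not overwritten). The following is a minimal, hypothetical sketch of how those two ideas compose, not the authors' implementation; the `prune_mask` helper, per-task sparsity values, and mask bookkeeping are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_mask(weights, sparsity):
    """Binary mask keeping the largest-magnitude fraction (1 - sparsity) of weights.

    Hypothetical helper: a simple magnitude criterion standing in for the
    paper's fine-grained data-driven pruning.
    """
    k = int(weights.size * (1.0 - sparsity))
    if k == 0:
        return np.zeros_like(weights, dtype=bool)
    threshold = np.sort(np.abs(weights).ravel())[::-1][k - 1]
    return np.abs(weights) >= threshold

weights = rng.normal(size=(4, 4))
frozen = np.zeros_like(weights, dtype=bool)   # parameters claimed by earlier tasks

# Illustrative per-task sparsity levels; in the paper these would be driven
# by data complexity and dataset size rather than fixed here.
for task_id, sparsity in enumerate([0.5, 0.75]):
    trainable = ~frozen                       # isolation: frozen weights stay untouched
    # ... train on task `task_id`, updating only weights[trainable] ...
    task_mask = prune_mask(np.where(trainable, weights, 0.0), sparsity) & trainable
    frozen |= task_mask                       # freeze the weights kept for this task
    print(f"task {task_id}: kept {task_mask.sum()}, frozen total {frozen.sum()}")
```

Because each task trains and prunes only within `~frozen`, earlier tasks' subnetworks cannot be overwritten, which is the sense in which parameter isolation limits catastrophic forgetting regardless of where task boundaries fall.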
Keywords: Adaptation models
Adaptive continual learning (AdaptCL)
Complexity theory
Data models
Data-driven pruning
Heterogeneous datasets
Knowledge engineering
Learning systems
Manuals
Parameter isolation
Task analysis
Task-agnostic continual learning
Publisher: Institute of Electrical and Electronics Engineers
Journal: IEEE transactions on neural networks and learning systems 
ISSN: 2162-237X
EISSN: 2162-2388
DOI: 10.1109/TNNLS.2023.3341841
Rights: © 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
The following publication Y. Zhao, D. Saxena and J. Cao, "AdaptCL: Adaptive Continual Learning for Tackling Heterogeneity in Sequential Datasets," in IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 2, pp. 2509-2522, Feb. 2025 is available at https://doi.org/10.1109/TNNLS.2023.3341841.
Appears in Collections:Journal/Magazine Article

Files in This Item:
File: Zhao_AdaptCL_Adaptive_Continual.pdf
Description: Pre-Published version
Size: 14.56 MB
Format: Adobe PDF
Open Access Information
Status: embargoed access
Embargo End Date: null

Page views: 89 (as of Apr 14, 2025)
Downloads: 24 (as of Apr 14, 2025)

SCOPUS™ citations: 6 (as of Dec 19, 2025)
Web of Science™ citations: 2 (as of Dec 19, 2024)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.