Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/105495
| Title: | A closer look at the training strategy for modern meta-learning |
| Authors: | Chen, J; Wu, XM; Li, Y; Li, Q; Zhan, LM; Chung, FL |
| Issue Date: | 2020 |
| Source: | Advances in Neural Information Processing Systems, 2020, v. 33, p. 396-406 |
| Abstract: | The support/query (S/Q) episodic training strategy has been widely used in modern meta-learning algorithms and is believed to improve their generalization ability to test environments. This paper conducts a theoretical investigation of this training strategy on generalization. From a stability perspective, we analyze the generalization error bound of generic meta-learning algorithms trained with this strategy. We show that the S/Q episodic training strategy naturally leads to a counterintuitive generalization bound of O(1/√n), which depends only on the task number n and is independent of the inner-task sample size m. Under the common assumption m ≪ n for few-shot learning, the bound of O(1/√n) implies strong generalization guarantees for modern meta-learning algorithms in the few-shot regime. To further explore the influence of training strategies on generalization, we propose a leave-one-out (LOO) training strategy for meta-learning and compare it with S/Q training. Experiments on standard few-shot regression and classification tasks with popular meta-learning algorithms validate our analysis. |
| Publisher: | NeurIPS |
| Journal: | Advances in Neural Information Processing Systems |
| Description: | 34th Conference on Neural Information Processing Systems (NeurIPS 2020), 6-12 December 2020, Online |
| Rights: | Posted with permission of the author. |
| Appears in Collections: | Conference Paper |
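Restated in display form, the central guarantee from the abstract above reads as follows. This is a schematic paraphrase only, with the stability assumptions and constants of the paper's theorem omitted; R and R̂_{S/Q} here denote the transfer risk and the empirical S/Q training risk in standard notation, not necessarily the paper's exact symbols.

```latex
% Schematic form of the bound described in the abstract: the expected gap
% between transfer risk and empirical S/Q risk shrinks with the number of
% training tasks n, independently of the per-task sample size m.
\[
  \mathbb{E}\!\left[ R(\hat{\theta}) - \widehat{R}_{\mathrm{S/Q}}(\hat{\theta}) \right]
  \;\le\; \mathcal{O}\!\left(\tfrac{1}{\sqrt{n}}\right),
  \qquad \text{independent of } m .
\]
```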
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Chen_Closer_Look_Training.pdf | | 387.07 kB | Adobe PDF |
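To make the two training strategies compared in the abstract concrete, here is a minimal sketch of how each one constructs episodes from a task's m labelled examples. The function names and toy data layout are illustrative assumptions, not the authors' code: S/Q draws one random support/query split per task, while LOO rotates each example through the query role.

```python
# Minimal sketch of the two episode-construction schemes discussed in the
# abstract. The function names and the toy data layout are illustrative,
# not taken from the paper's code.
import random
from typing import List, Sequence, Tuple

Example = Tuple[list, int]  # (features, label); placeholder type


def sq_split(task_data: Sequence[Example], n_support: int) -> Tuple[list, list]:
    """Support/query (S/Q) episode: one random split per task.

    The learner adapts on the support set and is meta-trained on the
    query-set loss, so the outer objective sees held-out data every episode.
    """
    shuffled = random.sample(list(task_data), len(task_data))
    return shuffled[:n_support], shuffled[n_support:]


def loo_splits(task_data: Sequence[Example]) -> List[Tuple[list, list]]:
    """Leave-one-out (LOO) episodes: each example serves once as the query
    while the remaining examples form the support set."""
    data = list(task_data)
    return [
        (data[:i] + data[i + 1:], [data[i]])  # (support, single-example query)
        for i in range(len(data))
    ]


if __name__ == "__main__":
    # Toy task with m = 5 labelled examples.
    task = [([float(i)], i % 2) for i in range(5)]
    support, query = sq_split(task, n_support=3)
    print(len(support), len(query))  # 3 2
    print(len(loo_splits(task)))     # 5 LOO episodes
```

Under S/Q training the outer objective is always evaluated on held-out query data; per the abstract, it is this mechanism that the paper's stability analysis ties to the O(1/√n) bound.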
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.


