Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/98513
| Field | Value |
|---|---|
| Title | Sequential domain adaptation by synthesizing distributionally robust experts |
| Authors | Taskesen, B; Yue, MC; Blanchet, J; Kuhn, D; Nguyen, VA |
| Issue Date | 2021 |
| Source | Proceedings of Machine Learning Research, 2021, v. 139, p. 10162-10172 |
| Abstract | Least squares estimators, when trained on few target domain samples, may predict poorly. Supervised domain adaptation aims to improve the predictive accuracy by exploiting additional labeled training samples from a source distribution that is close to the target distribution. Given available data, we investigate novel strategies to synthesize a family of least squares estimator experts that are robust with regard to moment conditions. When these moment conditions are specified using Kullback-Leibler or Wasserstein-type divergences, we can find the robust estimators efficiently using convex optimization. We use the Bernstein online aggregation algorithm on the proposed family of robust experts to generate predictions for the sequential stream of target test samples. Numerical experiments on real data show that the robust strategies systematically outperform non-robust interpolations of the empirical least squares estimators. |
| Publisher | PMLR web site |
| Journal | Proceedings of Machine Learning Research |
| ISSN | 2640-3498 |
| Description | 38th International Conference on Machine Learning, ICML 2021, 18-24 July 2021, Virtual |
| Rights | Copyright 2021 by the author(s). Posted with permission of the author. |
| Appears in Collections | Conference Paper |
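The abstract above describes a two-stage procedure: synthesize a family of distributionally robust least squares experts by convex optimization, then aggregate their predictions over a sequential stream of target test samples. The Python sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: a simple norm-regularized surrogate stands in for the paper's KL/Wasserstein moment formulations, and a plain exponentially weighted forecaster stands in for the Bernstein online aggregation algorithm. All data, the grid of robustness radii, and the learning rate are made up for illustration.

```python
# Minimal, hypothetical sketch of the pipeline sketched in the abstract (not the
# authors' code): (i) build a family of robust least squares experts via convex
# optimization, (ii) aggregate their predictions on a sequential target stream.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
d = 5
beta_src = rng.normal(size=d)
beta_tgt = beta_src + 0.3 * rng.normal(size=d)            # target close to source
X_src = rng.normal(size=(200, d))
y_src = X_src @ beta_src + 0.1 * rng.normal(size=200)     # many source samples
X_tgt = rng.normal(size=(10, d))
y_tgt = X_tgt @ beta_tgt + 0.1 * rng.normal(size=10)      # few target samples

def robust_least_squares(X, y, radius):
    """Norm-regularized least squares: a crude robust surrogate, solved as a convex program."""
    beta = cp.Variable(X.shape[1])
    objective = cp.norm(X @ beta - y, 2) / np.sqrt(len(y)) + radius * cp.norm(beta, 2)
    cp.Problem(cp.Minimize(objective)).solve()
    return beta.value

# Family of experts: pooled source+target fits over a grid of robustness radii.
X_pool = np.vstack([X_src, X_tgt])
y_pool = np.concatenate([y_src, y_tgt])
experts = [robust_least_squares(X_pool, y_pool, r) for r in (0.0, 0.05, 0.1, 0.2, 0.4)]

# Sequential aggregation on a stream of target test samples; exponentially
# weighted forecaster in place of Bernstein online aggregation.
eta = 1.0                                     # learning rate (hypothetical)
log_w = np.zeros(len(experts))
cumulative_loss = 0.0
for t in range(100):
    x = rng.normal(size=d)
    y = x @ beta_tgt + 0.1 * rng.normal()
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    preds = np.array([x @ b for b in experts])
    y_hat = float(w @ preds)                  # aggregated prediction for this round
    cumulative_loss += (y_hat - y) ** 2
    log_w -= eta * (preds - y) ** 2           # exponential weight update on squared losses

print(f"average squared error over the stream: {cumulative_loss / 100:.4f}")
```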
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| taskesen21a.pdf | | 756.21 kB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.