Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/98513
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Applied Mathematics | en_US |
| dc.creator | Taskesen, B | en_US |
| dc.creator | Yue, MC | en_US |
| dc.creator | Blanchet, J | en_US |
| dc.creator | Kuhn, D | en_US |
| dc.creator | Nguyen, VA | en_US |
| dc.date.accessioned | 2023-05-10T01:59:58Z | - |
| dc.date.available | 2023-05-10T01:59:58Z | - |
| dc.identifier.issn | 2640-3498 | en_US |
| dc.identifier.uri | http://hdl.handle.net/10397/98513 | - |
| dc.description | 38th International Conference on Machine Learning, ICML 2021, 18-24 July 2021, Virtual | en_US |
| dc.language.iso | en | en_US |
| dc.publisher | PMLR web site | en_US |
| dc.rights | Copyright 2021 by the author(s). | en_US |
| dc.rights | Posted with permission of the author. | en_US |
| dc.title | Sequential domain adaptation by synthesizing distributionally robust experts | en_US |
| dc.type | Conference Paper | en_US |
| dc.identifier.spage | 10162 | en_US |
| dc.identifier.epage | 10172 | en_US |
| dc.identifier.volume | 139 | en_US |
| dcterms.abstract | Least squares estimators, when trained on few target domain samples, may predict poorly. Supervised domain adaptation aims to improve the predictive accuracy by exploiting additional labeled training samples from a source distribution that is close to the target distribution. Given available data, we investigate novel strategies to synthesize a family of least squares estimator experts that are robust with regard to moment conditions. When these moment conditions are specified using Kullback-Leibler or Wasserstein-type divergences, we can find the robust estimators efficiently using convex optimization. We use the Bernstein online aggregation algorithm on the proposed family of robust experts to generate predictions for the sequential stream of target test samples. Numerical experiments on real data show that the robust strategies systematically outperform non-robust interpolations of the empirical least squares estimators. | en_US |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | Proceedings of Machine Learning Research, 2021, v. 139, p. 10162-10172 | en_US |
| dcterms.isPartOf | Proceedings of Machine Learning Research | en_US |
| dcterms.issued | 2021 | - |
| dc.relation.conference | International Conference on Machine Learning [ICML] | en_US |
| dc.description.validate | 202305 bcch | en_US |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | AMA-0032 | - |
| dc.description.fundingSource | RGC | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.identifier.OPUS | 55425991 | - |
| dc.description.oaCategory | Copyright retained by author | en_US |
| Appears in Collections: | Conference Paper |
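The abstract above describes synthesizing a family of robust least squares experts and aggregating their predictions over a sequential stream of target samples with Bernstein online aggregation. A minimal sketch of this idea follows, with several simplifying assumptions: synthetic data, ridge regularization standing in for the moment-based robust estimators derived in the paper, and a plain first-order exponential-weights update in place of Bernstein online aggregation (which adds a second-order correction to the loss). All names and sizes are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: few labeled target samples, many source samples
# drawn from a nearby (shifted) distribution.
n_src, n_tgt, d = 200, 10, 5
beta_true = rng.normal(size=d)
X_src = rng.normal(size=(n_src, d))
y_src = X_src @ beta_true + 0.5 + 0.3 * rng.normal(size=n_src)  # shifted source
X_tgt = rng.normal(size=(n_tgt, d))
y_tgt = X_tgt @ beta_true + 0.3 * rng.normal(size=n_tgt)

def ridge(X, y, lam):
    """Ridge-regularized least squares (a stand-in for the paper's
    distributionally robust estimators, which solve convex programs
    over KL- or Wasserstein-type moment ambiguity sets)."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

# A family of experts indexed by an (illustrative) robustness radius.
X_pool = np.vstack([X_src, X_tgt])
y_pool = np.concatenate([y_src, y_tgt])
radii = [0.01, 0.1, 1.0, 10.0, 100.0]
experts = [ridge(X_pool, y_pool, lam) for lam in radii]

# Sequential aggregation on a stream of target test points using
# exponential weights (simplified from Bernstein online aggregation).
eta = 0.1
w = np.ones(len(experts)) / len(experts)
stream_X = rng.normal(size=(50, d))
stream_y = stream_X @ beta_true + 0.3 * rng.normal(size=50)

agg_losses = []
for x, y in zip(stream_X, stream_y):
    preds = np.array([x @ b for b in experts])
    y_hat = w @ preds                   # weight-averaged prediction
    agg_losses.append((y_hat - y) ** 2)
    losses = (preds - y) ** 2           # per-expert squared loss
    w = w * np.exp(-eta * losses)       # multiplicative weight update
    w = w / w.sum()                     # renormalize

print(f"mean aggregated squared loss: {np.mean(agg_losses):.3f}")
```

The weights concentrate on whichever expert's implicit robustness level best matches the actual source-target shift, which is the mechanism the paper exploits without having to pick a single radius up front.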
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| taskesen21a.pdf |  | 756.21 kB | Adobe PDF |
Page views (as of Nov 10, 2025): 126 (last week: 7, last month: 7)
Downloads (as of Nov 10, 2025): 30
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.