Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/98513
DC Field: Value (Language)
dc.contributor: Department of Applied Mathematics (en_US)
dc.creator: Taskesen, B (en_US)
dc.creator: Yue, MC (en_US)
dc.creator: Blanchet, J (en_US)
dc.creator: Kuhn, D (en_US)
dc.creator: Nguyen, VA (en_US)
dc.date.accessioned: 2023-05-10T01:59:58Z
dc.date.available: 2023-05-10T01:59:58Z
dc.identifier.issn: 2640-3498 (en_US)
dc.identifier.uri: http://hdl.handle.net/10397/98513
dc.description: 38th International Conference on Machine Learning, ICML 2021, 18-24 July 2021, Virtual (en_US)
dc.language.iso: en (en_US)
dc.publisher: PMLR web site (en_US)
dc.rights: Copyright 2021 by the author(s). (en_US)
dc.rights: Posted with permission of the author. (en_US)
dc.title: Sequential domain adaptation by synthesizing distributionally robust experts (en_US)
dc.type: Conference Paper (en_US)
dc.identifier.spage: 10162 (en_US)
dc.identifier.epage: 10172 (en_US)
dc.identifier.volume: 139 (en_US)
dcterms.abstract: Least squares estimators, when trained on few target domain samples, may predict poorly. Supervised domain adaptation aims to improve the predictive accuracy by exploiting additional labeled training samples from a source distribution that is close to the target distribution. Given available data, we investigate novel strategies to synthesize a family of least squares estimator experts that are robust with regard to moment conditions. When these moment conditions are specified using Kullback-Leibler or Wasserstein-type divergences, we can find the robust estimators efficiently using convex optimization. We use the Bernstein online aggregation algorithm on the proposed family of robust experts to generate predictions for the sequential stream of target test samples. Numerical experiments on real data show that the robust strategies systematically outperform non-robust interpolations of the empirical least squares estimators. (en_US)
dcterms.accessRights: open access (en_US)
dcterms.bibliographicCitation: Proceedings of Machine Learning Research, 2021, v. 139, p. 10162-10172 (en_US)
dcterms.isPartOf: Proceedings of Machine Learning Research (en_US)
dcterms.issued: 2021
dc.relation.conference: International Conference on Machine Learning [ICML] (en_US)
dc.description.validate: 202305 bcch (en_US)
dc.description.oa: Version of Record (en_US)
dc.identifier.FolderNumber: AMA-0032
dc.description.fundingSource: RGC (en_US)
dc.description.pubStatus: Published (en_US)
dc.identifier.OPUS: 55425991
dc.description.oaCategory: Copyright retained by author (en_US)
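
The abstract above describes generating predictions for a sequential stream of target samples by aggregating a family of robust least squares experts with the Bernstein online aggregation algorithm. As a rough illustration only, the sketch below uses a simplified exponentially weighted average forecaster as a stand-in for Bernstein online aggregation (which additionally uses second-order loss corrections); the function `aggregate_stream`, the learning rate `eta`, and the squared-loss choice are all this sketch's assumptions, not the authors' code.

```python
import math

def aggregate_stream(expert_preds, targets, eta=0.5):
    """Exponentially weighted average forecaster over a fixed set of
    experts (a simplified stand-in for Bernstein online aggregation).

    expert_preds: one prediction sequence per expert
    targets: the sequential stream of target test labels
    """
    k = len(expert_preds)
    weights = [1.0 / k] * k  # start from uniform weights
    combined = []
    for t, y in enumerate(targets):
        # Aggregate: weighted average of the experts' current predictions.
        pred = sum(w * p[t] for w, p in zip(weights, expert_preds))
        combined.append(pred)
        # Reweight each expert by the squared loss it just incurred.
        losses = [(p[t] - y) ** 2 for p in expert_preds]
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
        total = sum(weights)
        weights = [w / total for w in weights]  # renormalize
    return combined, weights
```

With two hypothetical experts, one always right and one always wrong, the weight mass shifts toward the accurate expert as the stream progresses, which is the qualitative behavior the paper's aggregation step relies on.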
Appears in Collections: Conference Paper

Files in This Item:
File: taskesen21a.pdf (756.21 kB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.