Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/98575
DC Field | Value | Language
dc.contributor | Department of Applied Mathematics | en_US
dc.creator | Nguyen, VA | en_US
dc.creator | Shafieezadeh-Abadeh, S | en_US
dc.creator | Yue, MC | en_US
dc.creator | Kuhn, D | en_US
dc.creator | Wiesemann, W | en_US
dc.date.accessioned | 2023-05-10T02:00:25Z | -
dc.date.available | 2023-05-10T02:00:25Z | -
dc.identifier.isbn | 978-1-7138-0793-3 (Print on Demand (PoD)) | en_US
dc.identifier.uri | http://hdl.handle.net/10397/98575 | -
dc.description | 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), 8-14 Dec 2019, Vancouver, Canada | en_US
dc.language.iso | en | en_US
dc.publisher | NeurIPS | en_US
dc.rights | Copyright © (2019) by individual authors and Neural Information Processing Systems Foundation Inc. | en_US
dc.rights | Posted with permission of the author. | en_US
dc.title | Optimistic distributionally robust optimization for nonparametric likelihood approximation | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 15793 | en_US
dc.identifier.epage | 15803 | en_US
dc.identifier.volume | 20 | en_US
dcterms.abstract | The likelihood function is a fundamental component of Bayesian statistics, yet evaluating the likelihood of an observation is computationally intractable in many applications. In this paper, we propose a nonparametric approximation of the likelihood that identifies a probability measure that lies in a neighborhood of the nominal measure and maximizes the probability of observing the given sample point. We show that when the neighborhood is constructed via the Kullback-Leibler divergence, moment conditions, or the Wasserstein distance, our optimistic likelihood can be determined by solving a convex optimization problem, and it admits an analytical expression in particular cases. We also show that posterior inference with our optimistic likelihood approximation enjoys strong theoretical performance guarantees and performs competitively on a probabilistic classification task. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019, v. 20, p. 15793-15803 | en_US
dcterms.issued | 2019 | -
dc.relation.conference | Conference on Neural Information Processing Systems [NeurIPS] | en_US
dc.description.validate | 202305 bcch | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | AMA-0242 | -
dc.description.fundingSource | Others | en_US
dc.description.fundingText | EPSRC | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 23269977 | -
dc.description.oaCategory | Copyright retained by author | en_US
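The abstract's Wasserstein-neighborhood case has a simple discrete illustration: if the nominal measure is a weighted empirical measure and we maximize the mass placed on the observation over a type-1 Wasserstein ball, the resulting linear program reduces to a fractional-knapsack greedy that moves mass from the nearest atoms first. The sketch below is an illustration under these assumptions (1-D points, `optimistic_mass` is a hypothetical name), not the paper's exact formulation.

```python
import numpy as np

def optimistic_mass(points, weights, x, radius):
    """Greedy solution of  max_Q Q({x})  s.t.  W1(Q, P_hat) <= radius,
    where P_hat = sum_i weights[i] * delta(points[i]) is a discrete
    nominal measure on the real line (illustrative setup, not the
    paper's formulation).  Moving mass m from point p to x costs
    m * |p - x|, so the LP is a fractional knapsack: the cheapest
    mass per unit budget sits on the atoms closest to x."""
    d = np.abs(np.asarray(points, dtype=float) - x)  # distances to x
    mass, budget = 0.0, float(radius)
    for i in np.argsort(d):                 # nearest atoms first
        if d[i] == 0.0:                     # mass already on x is free
            mass += weights[i]
            continue
        movable = min(weights[i], budget / d[i])
        mass += movable
        budget -= movable * d[i]
        if budget <= 0.0:
            break
    return mass
```

For example, with atoms at 0, 1, 2 carrying weight 1/3 each and radius 1/3, the atom at 0 contributes its mass for free and the full budget moves the atom at distance 1, giving optimistic mass 2/3 at x = 0.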
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
Nguyen_Optimistic_Distributionally_Robust.pdf | - | 503.73 kB | Adobe PDF

Open Access Information
Status: open access
File Version: Version of Record

Page views: 40 (as of Jul 14, 2024)
Downloads: 11 (as of Jul 14, 2024)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.