Title: Regularizations for stochastic linear variational inequalities
Authors: Zhang, Y
Chen, X 
Keywords: Epi-convergence
Expected residual minimization
Sample average approximations
Stochastic variational inequality
Issue Date: 2014
Publisher: Springer
Source: Journal of Optimization Theory and Applications, 2014, p. 1-22
Journal: Journal of Optimization Theory and Applications
Abstract: This paper applies the Moreau-Yosida regularization to a convex expected residual minimization (ERM) formulation for a class of stochastic linear variational inequalities. To ensure convexity of the corresponding sample average approximation (SAA) problem, we adopt the Tikhonov regularization. We show that any cluster point of minimizers of the Tikhonov regularization for the SAA problem is a minimizer of the ERM formulation with probability one as the sample size goes to infinity and the Tikhonov regularization parameter goes to zero. Moreover, we prove that this minimizer is the least l2-norm solution of the ERM formulation. We also prove the semismoothness of the gradient of the Moreau-Yosida and Tikhonov regularizations for the SAA problem.
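The abstract's central mechanism can be illustrated on a toy problem. The sketch below is not the paper's ERM formulation for variational inequalities; it is a minimal stand-in (a rank-deficient least-squares objective built from random samples, with the matrix A, the noise model, and all parameter values chosen purely for illustration) showing the same two effects: the Tikhonov term makes the SAA objective strongly convex, and as the regularization parameter tends to zero its minimizer approaches the least l2-norm minimizer of the limiting problem.

```python
import numpy as np

# Toy stand-in for the Tikhonov-regularized SAA scheme described in the
# abstract (assumed setup, not the paper's model): minimize the sample
# average of ||A x - b_i||^2 plus eps * ||x||^2. A is rank-deficient, so
# the unregularized limiting problem has non-unique minimizers; Tikhonov
# regularization selects a unique one, and as eps -> 0 (with many samples)
# it approaches the least l2-norm minimizer.

rng = np.random.default_rng(0)

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])      # third coordinate lies in the null space
b_true = np.array([1.0, 2.0])
samples = b_true + 0.1 * rng.standard_normal((2000, 2))  # noisy observations

def tikhonov_saa_minimizer(eps):
    # Closed-form minimizer of mean_i ||A x - b_i||^2 + eps ||x||^2,
    # which reduces to (A^T A + eps I) x = A^T (mean of b_i).
    b_bar = samples.mean(axis=0)
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + eps * np.eye(n), A.T @ b_bar)

least_norm = np.linalg.pinv(A) @ b_true   # least l2-norm minimizer of the limit

for eps in (1.0, 1e-2, 1e-6):
    x_eps = tikhonov_saa_minimizer(eps)
    print(f"eps={eps:g}  distance to least-norm solution: "
          f"{np.linalg.norm(x_eps - least_norm):.4f}")
```

Shrinking eps drives the iterate toward the pseudoinverse (least l2-norm) solution, with a residual gap due to the finite sample; this mirrors, in a much simpler setting, the paper's joint limit in sample size and regularization parameter.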
ISSN: 0022-3239
EISSN: 1573-2878
DOI: 10.1007/s10957-013-0514-2
Appears in Collections: Journal/Magazine Article

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.