Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/22195
Title: Regularizations for stochastic linear variational inequalities
Authors: Zhang, Y
Chen, X 
Keywords: Epi-convergence
Expected residual minimization
Sample average approximations
Semismooth
Stochastic variational inequality
Issue Date: 2014
Source: Journal of Optimization Theory and Applications, 2014, p. 1-22
Journal: Journal of Optimization Theory and Applications 
Abstract: This paper applies the Moreau-Yosida regularization to a convex expected residual minimization (ERM) formulation for a class of stochastic linear variational inequalities. To ensure convexity of the corresponding sample average approximation (SAA) problem, we adopt the Tikhonov regularization. We show that any cluster point of minimizers of the Tikhonov regularization for the SAA problem is a minimizer of the ERM formulation with probability one as the sample size goes to infinity and the Tikhonov regularization parameter goes to zero. Moreover, we prove that this minimizer is the least $l_2$-norm solution of the ERM formulation. We also prove the semismoothness of the gradient of the Moreau-Yosida and Tikhonov regularizations for the SAA problem.
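As a reading aid, here is a minimal sketch of the two formulations named in the abstract, in illustrative notation; the residual g, the feasible set X, the samples \xi^i, the sample size N, and the parameter \varepsilon_N are assumptions made for this sketch and are not taken from the record itself:

\[
\min_{x \in X} \; \theta(x) := \mathbb{E}_{\xi}\bigl[\, g(x,\xi) \,\bigr]
\qquad \text{(convex ERM formulation)}
\]
\[
\min_{x \in X} \; \theta_N(x) + \frac{\varepsilon_N}{2}\,\|x\|_2^2,
\qquad
\theta_N(x) := \frac{1}{N}\sum_{i=1}^{N} g(x,\xi^i)
\qquad \text{(Tikhonov-regularized SAA problem)}
\]

In these terms, the abstract's convergence statement reads: as N goes to infinity and \varepsilon_N goes to zero, any cluster point of minimizers of the Tikhonov-regularized SAA problem is, with probability one, the least $l_2$-norm minimizer of \theta over X.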
URI: http://hdl.handle.net/10397/22195
ISSN: 0022-3239
DOI: 10.1007/s10957-013-0514-2
Appears in Collections: Journal/Magazine Article

