Regularizations for Stochastic Linear Variational Inequalities

Yanfang Zhang, Xiaojun Chen

Research output: Journal article › peer-review

6 Citations (Scopus)

Abstract

This paper applies the Moreau–Yosida regularization to a convex expected residual minimization (ERM) formulation for a class of stochastic linear variational inequalities. To preserve the convexity of the corresponding sample average approximation (SAA) problem, we adopt the Tikhonov regularization. We show that, with probability one, any cluster point of minimizers of the Tikhonov regularization of the SAA problem is a minimizer of the ERM formulation as the sample size goes to infinity and the Tikhonov regularization parameter goes to zero. Moreover, we prove that this minimizer is the least l2-norm solution of the ERM formulation. We also prove the semismoothness of the gradient of the Moreau–Yosida and Tikhonov regularizations of the SAA problem.
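The two regularizations named in the abstract can be sketched numerically. The snippet below is an illustrative sketch only, not the paper's method: it shows (i) the Moreau–Yosida envelope of the absolute-value function, whose proximal map is soft-thresholding and whose gradient is smooth, and (ii) a Tikhonov-regularized SAA objective built from a hypothetical "min" residual for sampled linear complementarity problems LCP(M_i, q_i). The residual choice and problem data are assumptions for demonstration; the paper's ERM residual may differ.

```python
import numpy as np

def moreau_yosida_abs(x, lam):
    """Moreau-Yosida envelope of f(t) = |t| with parameter lam > 0.
    Returns the envelope value and its (Lipschitz-continuous) gradient,
    computed via the proximal map, which here is soft-thresholding.
    Illustrative example, not the paper's specific regularization."""
    prox = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)  # prox_{lam|.|}(x)
    env = np.abs(prox) + (x - prox) ** 2 / (2.0 * lam)    # f_lam(x), the Huber function
    grad = (x - prox) / lam                               # gradient of the envelope
    return env, grad

def saa_tikhonov(x, Ms, qs, eps):
    """Sample average of the squared 'min' residual of LCP(M_i, q_i),
    plus a Tikhonov term eps * ||x||^2 (hypothetical residual choice).
    As eps -> 0 and the sample grows, minimizers are expected to
    approach a least l2-norm minimizer of the ERM objective."""
    vals = [np.minimum(x, M @ x + q) for M, q in zip(Ms, qs)]
    return sum(v @ v for v in vals) / len(Ms) + eps * (x @ x)
```

For small `|x|` the envelope reduces to the quadratic `x**2 / (2 * lam)`, which is why the Moreau–Yosida regularization yields a differentiable surrogate of a nonsmooth residual, and the `eps * (x @ x)` term makes the SAA objective strongly convex for any `eps > 0`.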
Original language: English
Pages (from-to): 460-481
Number of pages: 22
Journal: Journal of Optimization Theory and Applications
Volume: 163
Issue number: 2
DOIs
Publication status: Published - 1 Jan 2014

Keywords

  • Epi-convergence
  • Expected residual minimization
  • Sample average approximations
  • Semismooth
  • Stochastic variational inequality

ASJC Scopus subject areas

  • Control and Optimization
  • Management Science and Operations Research
  • Applied Mathematics
