Smoothing methods for nonsmooth, nonconvex minimization

Research output: Journal article, peer-reviewed

217 Citations (Scopus)

Abstract

We consider a class of smoothing methods for minimization problems where the feasible set is convex but the objective function is not convex, not differentiable, and perhaps not even locally Lipschitz at the solutions. Such optimization problems arise in a wide range of applications, including image restoration, signal reconstruction, variable selection, optimal control, stochastic equilibrium and spherical approximations. In this paper, we focus on smoothing methods for solving such optimization problems, which use the structure of the minimization problems and the composition of smoothing functions for the plus function (x)+ = max(x, 0). Many existing optimization algorithms and codes can be used in the inner iteration of the smoothing methods. We present properties of the smoothing functions and the gradient consistency of the subdifferential associated with a smoothing function. Moreover, we describe how to update the smoothing parameter in the outer iteration of the smoothing methods to guarantee convergence of the smoothing methods to a stationary point of the original minimization problem.
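The inner/outer structure described in the abstract can be illustrated with a short sketch. The Python example below is not taken from the paper: it uses a CHKS-type smoothing of the plus function, a toy least-squares objective with an l1-style penalty written through (x)+, SciPy's L-BFGS-B as the inner solver, and an assumed geometric update of the smoothing parameter; all of these specific choices are illustrative assumptions.

    # Minimal sketch (not the paper's algorithm): replace the plus function
    # (x)+ = max(x, 0) by a smooth approximation with parameter mu > 0,
    # solve the smoothed problem with a standard solver in the inner
    # iteration, then shrink mu in the outer iteration.
    import numpy as np
    from scipy.optimize import minimize

    def smooth_plus(t, mu):
        # CHKS-type smoothing of (t)+; recovers max(t, 0) as mu -> 0.
        return 0.5 * (t + np.sqrt(t**2 + 4.0 * mu**2))

    def smoothed_objective(x, mu):
        # Illustrative nonsmooth objective: least squares plus an l1-type
        # penalty expressed via the plus function, |t| = (t)+ + (-t)+.
        A = np.array([[3.0, 1.0], [1.0, 2.0]])
        b = np.array([1.0, -1.0])
        residual = A @ x - b
        l1_term = np.sum(smooth_plus(x, mu) + smooth_plus(-x, mu))
        return 0.5 * residual @ residual + 0.1 * l1_term

    x = np.zeros(2)
    mu = 1.0
    for outer in range(10):
        # Inner iteration: any standard smooth solver can be used here.
        res = minimize(smoothed_objective, x, args=(mu,), method="L-BFGS-B")
        x = res.x
        # Outer iteration: drive the smoothing parameter toward zero
        # (the factor 0.1 is an arbitrary choice for this sketch).
        mu *= 0.1
    print(x)

As the abstract notes, the point of such a scheme is that the inner subproblems are smooth, so off-the-shelf codes apply, while the outer update of mu is what must be controlled to obtain convergence to a stationary point of the original nonsmooth problem.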
Original language: English
Pages (from-to): 71-99
Number of pages: 29
Journal: Mathematical Programming
Volume: 134
Issue number: 1
DOIs
Publication status: Published - 1 Aug 2012

Keywords

  • Eigenvalue optimization
  • Nonconvex minimization
  • Nonsmooth
  • Regularized minimization problems
  • Smoothing methods
  • Stochastic variational inequality problems

ASJC Scopus subject areas

  • Software
  • General Mathematics
