Abstract
Based on the notion of the ε-subgradient, we present a unified technique to establish convergence properties of several methods for nonsmooth convex minimization problems. Starting from the technical results, we obtain the global convergence of: (i) the variable metric proximal methods presented by Bonnans, Gilbert, Lemaréchal, and Sagastizábal, (ii) some algorithms proposed by Correa and Lemaréchal, and (iii) the proximal point algorithm given by Rockafellar. In particular, we prove that the Rockafellar-Todd phenomenon does not occur for any of the above-mentioned methods. Moreover, we explore the convergence rate of {∥xk∥} and {f(xk)} when {xk} is unbounded and {f(xk)} is bounded for the nonsmooth minimization methods (i), (ii), and (iii).
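The proximal point algorithm of (iii) generates iterates xk+1 = argmin_x { f(x) + (1/(2λk))∥x − xk∥² }. A minimal Python sketch for the nonsmooth convex function f(x) = |x|, whose proximal operator has the closed-form soft-thresholding solution; this one-dimensional toy example and the step size `lam` are illustrative assumptions, not the paper's general setting:

```python
def prox_abs(v, lam):
    # Closed-form proximal operator of f(x) = |x| (soft-thresholding):
    # argmin_x |x| + (1 / (2 * lam)) * (x - v)**2
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def proximal_point(x0, lam=0.5, iters=50):
    # Proximal point iteration: x_{k+1} = prox_{lam * f}(x_k).
    # For f = |.| the iterates shrink toward the minimizer x* = 0.
    x = x0
    for _ in range(iters):
        x = prox_abs(x, lam)
    return x

x_star = proximal_point(5.0)  # converges to the minimizer 0.0
```

Each iteration moves xk a fixed distance `lam` toward 0 until it lands in the interval [−lam, lam], after which the prox step maps it exactly to the minimizer.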
| Original language | English |
| --- | --- |
| Pages (from-to) | 141-158 |
| Number of pages | 18 |
| Journal | Applied Mathematics and Optimization |
| Volume | 38 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 1 Jan 1998 |
| Externally published | Yes |
Keywords
- Convergence rate
- Global convergence
- Nonsmooth convex minimization
ASJC Scopus subject areas
- Control and Optimization
- Applied Mathematics