Abstract
It is well known that a possibly nondifferentiable convex minimization problem can be transformed into a differentiable convex minimization problem by way of the Moreau-Yosida regularization. This paper presents a globally convergent algorithm that is designed to solve the latter problem. Under additional semismoothness and regularity assumptions, the proposed algorithm is shown to have a Q-superlinear rate of convergence.
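The construction the abstract refers to is easy to illustrate concretely. As a minimal sketch (not the authors' algorithm), the Python code below evaluates the Moreau-Yosida regularization F_λ(x) = min_y { f(y) + ‖y − x‖²/(2λ) } for the nondifferentiable choice f(x) = |x|. In this special case the minimizer is the soft-thresholding (proximal) operator and the envelope is the everywhere-differentiable Huber function, with gradient (x − prox(x))/λ; all function names are illustrative.

```python
import numpy as np

# Moreau-Yosida regularization of f with parameter lam > 0:
#   F_lam(x) = min_y  f(y) + ||y - x||^2 / (2*lam)
# For f(x) = |x| (nondifferentiable at 0), the minimizer is given by
# soft-thresholding, and F_lam is the Huber function, which is
# continuously differentiable everywhere.

def prox_abs(x, lam):
    """Proximal operator of lam*|.|: soft-thresholding at level lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    """Moreau-Yosida regularization of |.| evaluated at x."""
    y = prox_abs(x, lam)
    return np.abs(y) + (y - x) ** 2 / (2.0 * lam)

def moreau_gradient_abs(x, lam):
    """Gradient of the envelope: (x - prox(x)) / lam, Lipschitz with 1/lam."""
    return (x - prox_abs(x, lam)) / lam

if __name__ == "__main__":
    lam = 0.5
    for x in np.linspace(-2.0, 2.0, 9):
        print(f"x={x:+.2f}  F(x)={moreau_envelope_abs(x, lam):.4f}  "
              f"F'(x)={moreau_gradient_abs(x, lam):+.4f}")
```

Printing the gradient on a grid shows it varying continuously through x = 0 (where |x| itself has a kink), which is exactly the smoothing property the paper exploits.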
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 1106-1120 |
| Number of pages | 15 |
| Journal | SIAM Journal on Optimization |
| Volume | 6 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Jan 1996 |
| Externally published | Yes |
Keywords
- Global convergence
- Moreau-Yosida regularization
- Nonsmooth convex optimization
- Semismoothness
- Superlinear convergence
ASJC Scopus subject areas
- Software
- Theoretical Computer Science