Abstract
This paper presents a globally convergent and locally superlinearly convergent method for solving a convex minimization problem whose objective function has a semismooth but nondifferentiable gradient. Applications to nonlinear minimax problems, stochastic programs with recourse, and their extensions are discussed.
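The record carries only the abstract, but the method it describes belongs to the semismooth (generalized) Newton family for convex SC¹ minimization: take a generalized-Hessian element at the current point, solve the Newton system, and globalize with a line search on the objective. The sketch below is a generic damped generalized-Newton iteration with Armijo backtracking, offered only as an illustration of that idea, not the paper's algorithm; the names `semismooth_newton` and `gen_hess`, the fallback rule, and the toy SC¹ objective are all assumptions introduced here.

```python
import numpy as np

def semismooth_newton(f, grad, gen_hess, x0, tol=1e-10, max_iter=100,
                      sigma=1e-4, beta=0.5):
    """Damped generalized-Newton sketch for convex minimization with a
    semismooth gradient (illustrative; not the paper's exact method).

    f        : convex, continuously differentiable objective
    grad     : its gradient (locally Lipschitz, semismooth)
    gen_hess : returns one element of the generalized Hessian at x
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        V = gen_hess(x)
        try:
            d = np.linalg.solve(V, -g)      # generalized Newton direction
        except np.linalg.LinAlgError:
            d = -g                          # fall back to steepest descent
        if g @ d > -1e-12 * np.dot(d, d):
            d = -g                          # ensure a descent direction
        t = 1.0                             # Armijo backtracking line search
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Toy SC^1 example (assumed): f(x) = 0.5*||max(x,0)||^2 + 0.5*||x-a||^2,
# whose gradient max(x,0) + (x - a) is piecewise linear, hence semismooth
# but nondifferentiable at x_i = 0.
a = np.array([2.0, -1.0, 0.5])
f = lambda x: 0.5 * np.sum(np.maximum(x, 0.0)**2) + 0.5 * np.sum((x - a)**2)
grad = lambda x: np.maximum(x, 0.0) + (x - a)
gen_hess = lambda x: np.diag((x > 0).astype(float)) + np.eye(len(x))

print(semismooth_newton(f, grad, gen_hess, np.zeros(3)))  # approx [1.0, -1.0, 0.25]
```

On this toy problem the iteration reaches the minimizer in two steps with full step length, consistent with the fast local behaviour the abstract refers to; the global line-search safeguard only matters far from the solution.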
Original language | English |
---|---|
Pages (from-to) | 633-648 |
Number of pages | 16 |
Journal | Journal of Optimization Theory and Applications |
Volume | 85 |
Issue number | 3 |
DOIs | |
Publication status | Published - 1 Jun 1995 |
Externally published | Yes |
Keywords
- Newton method
- Nonsmooth optimization
ASJC Scopus subject areas
- Control and Optimization
- Management Science and Operations Research
- Applied Mathematics