A globally convergent Newton method for convex SC1 minimization problems

J. S. Pang, Liqun Qi

Research output: Journal article publication › Journal article › Academic research › peer-review

52 Citations (Scopus)

Abstract

This paper presents a globally convergent and locally superlinearly convergent method for solving a convex minimization problem whose objective function has a semismooth but nondifferentiable gradient. Applications to nonlinear minimax problems, stochastic programs with recourse, and their extensions are discussed.
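As an informal illustration of this class of algorithms (a sketch only, not the method developed in the paper), the snippet below runs a damped generalized Newton iteration, globalized by an Armijo line search on the objective, on a toy convex SC1 problem. The test function, the choice of generalized-Hessian element, and all parameter values are assumptions made for the example.

```python
import numpy as np

def semismooth_newton(f, grad, gen_hess, x0, tol=1e-10, max_iter=50,
                      sigma=1e-4, beta=0.5):
    """Damped generalized (semismooth) Newton iteration with Armijo line search.

    Illustrative sketch only: gen_hess(x) is assumed to return *some* element
    of the generalized Jacobian of the gradient at x (a "generalized Hessian").
    """
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        V = gen_hess(x)
        # Newton direction from one generalized Hessian element; fall back to
        # steepest descent if the solve fails or the direction is not a descent direction.
        try:
            d = np.linalg.solve(V, -g)
        except np.linalg.LinAlgError:
            d = -g
        if g @ d >= 0:
            d = -g
        # Armijo backtracking on the objective to enforce global convergence.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Toy SC1 objective (an assumed example, not from the paper):
#   f(x) = 0.5*||max(0, A x - b)||^2 + 0.5*||x||^2
# Its gradient A^T max(0, A x - b) + x is piecewise smooth, hence semismooth.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
b = rng.standard_normal(8)

f = lambda x: 0.5 * np.sum(np.maximum(A @ x - b, 0.0) ** 2) + 0.5 * x @ x
grad = lambda x: A.T @ np.maximum(A @ x - b, 0.0) + x

def gen_hess(x):
    # One generalized Hessian element: A^T D A + I with D the active-set indicator.
    active = (A @ x - b > 0.0).astype(float)
    return A.T @ (active[:, None] * A) + np.eye(len(x))

x_star = semismooth_newton(f, grad, gen_hess, np.zeros(5))
print(np.linalg.norm(grad(x_star)))  # close to 0 at the minimizer
```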
Original language: English
Pages (from-to): 633-648
Number of pages: 16
Journal: Journal of Optimization Theory and Applications
Volume: 85
Issue number: 3
DOIs
Publication status: Published - 1 Jun 1995
Externally published: Yes

Keywords

  • Newton method
  • Nonsmooth optimization

ASJC Scopus subject areas

  • Control and Optimization
  • Management Science and Operations Research
  • Applied Mathematics
