Globally and superlinearly convergent algorithm for minimizing a normal merit function

Elijah Polak, Liqun Qi

Research output: Journal article publication › Journal article › Academic research › peer-review

6 Citations (Scopus)

Abstract

In this paper we present two new concepts related to the solution of systems of nonsmooth equations (NE) and variational inequalities (VI). The first concept is that of a normal merit function, which summarizes the simple basic properties shared by various known merit functions. In general, normal merit functions are locally Lipschitz, but not differentiable. The second concept is that of a Newtonian operator, whose values generalize the concept of the Hessian for normal merit functions. These two concepts are then used to generalize the nonsmooth Newton method for solving the equation ∇f(x) = 0, where f is a normal merit function with f ∈ C1, to the case where f is only locally Lipschitz and the set-valued inclusion 0 ∈ ∂f(x) needs to be solved. Combining the resulting generalized Newton method with certain first-order methods, we obtain a globally and superlinearly convergent algorithm for minimizing normal merit functions.
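
The abstract describes a hybrid strategy: take a generalized Newton step built from an element of the Newtonian operator when it is available and gives descent, and otherwise fall back to a first-order step so that global convergence is retained. The Python sketch below illustrates only that general pattern; it is a minimal illustration under simplifying assumptions (a smooth toy problem, a single matrix standing in for an element of the Newtonian operator), not the paper's algorithm, and the names hybrid_newton_first_order and newtonian, as well as the quadratic test problem, are invented for the example.

```python
import numpy as np

def hybrid_newton_first_order(f, grad, newtonian, x0,
                              tol=1e-10, max_iter=100,
                              armijo_c=1e-4, backtrack=0.5):
    """Illustrative hybrid scheme (assumption, not the paper's method):
    try a generalized Newton step V d = -g with V taken from a
    'Newtonian operator'; fall back to a first-order (steepest-descent)
    step with Armijo backtracking when the Newton direction is unusable
    or fails to give sufficient descent."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:   # approximate stationarity test
            break
        # Attempt the Newton direction; fall back to -g if V is singular
        # or the direction is not a sufficient descent direction.
        try:
            d = np.linalg.solve(newtonian(x), -g)
            if g @ d > -1e-8 * np.linalg.norm(g) * np.linalg.norm(d):
                d = -g
        except np.linalg.LinAlgError:
            d = -g
        # Armijo backtracking line search enforces global convergence.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + armijo_c * t * (g @ d):
            t *= backtrack
            if t < 1e-12:
                break
        x = x + t * d
    return x

# Toy usage on a smooth quadratic merit function (purely illustrative).
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    newtonian = lambda x: A   # stand-in for an element of the Newtonian operator
    print(hybrid_newton_first_order(f, grad, newtonian, np.zeros(2)))
```

Near a solution the Newton step is accepted with unit step length, which is what yields the superlinear local rate, while the line-searched fallback step keeps the iteration globally convergent.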
Original language: English
Pages (from-to): 1005-1019
Number of pages: 15
Journal: SIAM Journal on Control and Optimization
Volume: 36
Issue number: 3
DOIs
Publication status: Published - 1 Jan 1998
Externally published: Yes

Keywords

  • First-order algorithm
  • Generalized Newton method
  • Global convergence
  • Normal merit function
  • Superlinear convergence

ASJC Scopus subject areas

  • Control and Optimization
  • Applied Mathematics
