A preconditioning proximal Newton method for nondifferentiable convex optimization

Liqun Qi, Xiaojun Chen

Research output: Journal article › Academic research › peer-review

28 Citations (Scopus)


We propose a proximal Newton method for solving nondifferentiable convex optimization problems. This method combines the generalized Newton method with Rockafellar's proximal point algorithm. At each step, the proximal point is found approximately, and the regularization matrix is preconditioned to overcome the inexactness of this approximation. We show that such preconditioning is possible within some accuracy, and that the second-order differentiability properties of the Moreau-Yosida regularization are invariant with respect to this preconditioning. Based on these results, superlinear convergence is established under a semismoothness condition.
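To make the objects in the abstract concrete, the following is a minimal sketch (not the paper's preconditioned Newton method) of the Moreau-Yosida regularization and Rockafellar's proximal point iteration, assuming for illustration the simple convex function f(x) = ‖x‖₁, whose proximal operator has a closed form (soft thresholding):

```python
import numpy as np

# Illustrative sketch only: f(x) = ||x||_1 is an assumption chosen because
# its proximal operator is known in closed form. The Moreau-Yosida
# regularization of a convex f is
#   F_lam(x) = min_y  f(y) + (1/(2*lam)) * ||y - x||^2,
# which is differentiable with gradient (x - prox_{lam f}(x)) / lam.

def prox_l1(x, lam):
    """Proximal operator of lam*||.||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_yosida(x, lam):
    """Value and gradient of the Moreau-Yosida regularization of ||.||_1."""
    p = prox_l1(x, lam)                      # the proximal point of x
    val = np.sum(np.abs(p)) + np.sum((p - x) ** 2) / (2.0 * lam)
    grad = (x - p) / lam                     # gradient of F_lam at x
    return val, grad

def proximal_point(x0, lam=1.0, iters=20):
    """Rockafellar's proximal point iteration x_{k+1} = prox_{lam f}(x_k)."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = prox_l1(x, lam)
    return x
```

Minimizing F_lam is equivalent to minimizing f, which is what makes Newton-type methods on the (smooth) regularization attractive; the paper's contribution concerns applying a generalized Newton step to this regularization when the proximal point is only computed approximately.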
Original language: English
Pages (from-to): 411-429
Number of pages: 19
Journal: Mathematical Programming, Series B
Issue number: 3
Publication status: Published - 1 Mar 1997
Externally published: Yes


Keywords

  • Newton's method
  • Nondifferentiable convex optimization
  • Proximal point
  • Superlinear convergence

ASJC Scopus subject areas

  • Software
  • General Mathematics


