A globally and superlinearly convergent algorithm for nonsmooth convex minimization

Masao Fukushima, Liqun Qi

Research output: Journal article (peer-reviewed)

77 Citations (Scopus)


It is well known that a possibly nondifferentiable convex minimization problem can be transformed into a differentiable convex minimization problem by way of the Moreau-Yosida regularization. This paper presents a globally convergent algorithm that is designed to solve the latter problem. Under additional semismoothness and regularity assumptions, the proposed algorithm is shown to have a Q-superlinear rate of convergence.
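
As a brief sketch of the construction the abstract refers to (using notation chosen here, which may differ from the paper's): for a convex function f and a parameter \lambda > 0, the Moreau-Yosida regularization of f is

```latex
F_\lambda(x) \;=\; \min_{y} \left\{ f(y) + \frac{1}{2\lambda} \, \lVert y - x \rVert^2 \right\}.
```

Even when f is nondifferentiable, F_\lambda is convex and continuously differentiable, with gradient \nabla F_\lambda(x) = \frac{1}{\lambda}\bigl(x - p_\lambda(x)\bigr), where p_\lambda(x) denotes the unique minimizer above (the proximal point of x). Since F_\lambda and f share the same set of minimizers, minimizing the smooth function F_\lambda solves the original nonsmooth problem.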
Original language: English
Pages (from-to): 1106-1120
Number of pages: 15
Journal: SIAM Journal on Optimization
Issue number: 4
Publication status: Published - 1 Jan 1996
Externally published: Yes


Keywords

  • Global convergence
  • Moreau-Yosida regularization
  • Nonsmooth convex optimization
  • Semismoothness
  • Superlinear convergence

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science


