An augmented Lagrangian method for non-Lipschitz nonconvex programming

Xiaojun Chen, Lei Guo, Zhaosong Lu, Jane J. Ye

Research output: Journal article › Academic research › peer-review

13 Citations (Scopus)

Abstract

We consider a class of constrained optimization problems where the objective function is a sum of a smooth function and a nonconvex non-Lipschitz function. Many problems in sparse portfolio selection, edge-preserving image restoration, and signal processing can be modelled in this form. First, we propose the concept of the Karush-Kuhn-Tucker (KKT) stationary condition for the non-Lipschitz problem and show that it is necessary for optimality under a constraint qualification called the relaxed constant positive linear dependence (RCPLD) condition, which is weaker than the Mangasarian-Fromovitz constraint qualification and holds automatically if all the constraint functions are affine. Then we propose an augmented Lagrangian (AL) method in which the augmented Lagrangian subproblems are solved by a nonmonotone proximal gradient method. Under the assumption that a feasible point is known, we show that any accumulation point of the sequence generated by our method must be a feasible point. Moreover, if RCPLD holds at such an accumulation point, then it is a KKT point of the original problem. Finally, we conduct numerical experiments to compare the performance of our AL method and the interior point (IP) method for solving two sparse portfolio selection models. The numerical results demonstrate that our method is not only comparable to the IP method in terms of solution quality, but also substantially faster than the IP method.
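To make the structure of the method concrete, the following is a minimal, illustrative sketch of an augmented Lagrangian outer loop whose subproblems are handled by a proximal gradient inner loop, applied to a toy instance: minimize a smooth quadratic plus a non-Lipschitz penalty λ·Σ|x_i|^(1/2) subject to a single affine constraint Σx_i = 1. This is not the algorithm from the paper (which uses a nonmonotone proximal gradient inner solver with specific safeguards and parameter updates); all function names, the grid-based prox evaluation, and the hyperparameter choices here are assumptions made for illustration.

```python
import numpy as np

def prox_sqrt(t, w):
    """Elementwise proximal map of s -> w * |s|^(1/2), a non-Lipschitz
    penalty at 0. Evaluated numerically over {0} plus a fine grid around
    each t_i (an illustrative stand-in for a closed-form prox)."""
    out = np.empty_like(t)
    for i, ti in enumerate(t):
        grid = np.linspace(-abs(ti) - 1.0, abs(ti) + 1.0, 2001)
        cand = np.append(grid, 0.0)  # include 0 exactly: prox often thresholds
        vals = 0.5 * (cand - ti) ** 2 + w * np.sqrt(np.abs(cand))
        out[i] = cand[np.argmin(vals)]
    return out

def al_method(Q, c, lam, rho=10.0, mu=0.0, outer=20, inner=200):
    """Sketch of an augmented Lagrangian loop for
        min 0.5 x'Qx + c'x + lam * sum_i |x_i|^(1/2)   s.t.  sum(x) = 1.
    Each AL subproblem is approximately solved by plain (monotone)
    proximal gradient; rho, step size, and iteration counts are
    illustrative choices, not a schedule from the paper."""
    n = len(c)
    x = np.full(n, 1.0 / n)             # a known feasible starting point
    a = np.ones(n)                      # constraint h(x) = a'x - 1 = 0
    L = np.linalg.norm(Q, 2) + rho * n  # crude Lipschitz bound, smooth part
    step = 1.0 / L
    for _ in range(outer):
        for _ in range(inner):
            h = a @ x - 1.0
            # gradient of smooth part of the AL function:
            # 0.5 x'Qx + c'x + mu*h(x) + (rho/2)*h(x)^2
            grad = Q @ x + c + (mu + rho * h) * a
            x = prox_sqrt(x - step * grad, step * lam)
        mu += rho * (a @ x - 1.0)       # first-order multiplier update
    return x, mu
```

Note how the non-Lipschitz term never enters the gradient: it is handled entirely through its proximal map, which is what makes a proximal gradient inner solver natural for this problem class, and how the multiplier update drives the accumulation point toward feasibility, mirroring the feasibility guarantee stated in the abstract.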
Original language: English
Pages (from-to): 168-193
Number of pages: 26
Journal: SIAM Journal on Numerical Analysis
Volume: 55
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2017

Keywords

  • Augmented Lagrangian method
  • Non-Lipschitz programming
  • Sparse optimization

ASJC Scopus subject areas

  • Numerical Analysis
