A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems

Xudong Li, Defeng Sun, Kim-Chuan Toh

Research output: Journal article publication › Journal article › Academic research › peer-review

111 Citations (Scopus)

Abstract

We develop a fast and robust algorithm for solving large-scale convex composite optimization models, with an emphasis on ℓ1-regularized least squares regression (lasso) problems. Although a large number of solvers for lasso problems exist in the literature, we found that no solver can efficiently handle difficult large-scale regression problems with real data. By leveraging available error bound results to realize the asymptotic superlinear convergence property of the augmented Lagrangian algorithm, and by exploiting the second-order sparsity of the problem through the semismooth Newton method, we propose an algorithm, called Ssnal, to efficiently solve these difficult problems. Under very mild conditions, which hold automatically for lasso problems, both the primal and the dual iteration sequences generated by Ssnal possess a fast linear convergence rate, which can even be asymptotically superlinear. Numerical comparisons between our approach and a number of state-of-the-art solvers on real data sets demonstrate the high efficiency and robustness of our proposed algorithm in solving difficult large-scale lasso problems.
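The lasso model referred to above is min_x (1/2)‖Ax − b‖² + λ‖x‖₁. To make the abstract's "second-order sparsity" concrete, the sketch below is a minimal illustration, not the authors' implementation: the function names, the dense linear solve, and the active-set construction are our own assumptions. The structural point it shows is that an element of the generalized Jacobian of the soft-thresholding proximal mapping is a 0/1 diagonal matrix, so the semismooth Newton system only involves the columns of A indexed by the (typically small) active set.

    import numpy as np

    def soft_threshold(v, lam):
        """Proximal mapping of lam * ||.||_1 (soft-thresholding)."""
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    def ssn_direction(A, v, lam, sigma, rhs):
        """One semismooth Newton linear solve, exploiting second-order sparsity.

        An element of the generalized Jacobian of soft_threshold at v is
        Diag(theta) with theta_i = 1 iff |v_i| > lam, so the Newton matrix
        I + sigma * A Diag(theta) A^T collapses to I + sigma * A_S A_S^T,
        where S is the active index set.
        """
        S = np.abs(v) > lam
        A_S = A[:, S]                       # only active columns are needed
        m = A.shape[0]
        H = np.eye(m) + sigma * (A_S @ A_S.T)
        return np.linalg.solve(H, rhs)      # cost driven by |S|, not by n

    # Tiny illustration on random data (hypothetical sizes).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 50))
    v = rng.standard_normal(50)
    d = ssn_direction(A, v, lam=1.0, sigma=1.0, rhs=rng.standard_normal(5))

Since lasso solutions are sparse, the active set shrinks as the iterates converge, which is why each Newton system in Ssnal can be solved far more cheaply than its nominal dimension suggests.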

Original language: English
Pages (from-to): 433-458
Number of pages: 26
Journal: SIAM Journal on Optimization
Volume: 28
Issue number: 1
Publication status: Published - 2018

Keywords

  • Augmented Lagrangian
  • Lasso
  • Metric subregularity
  • Newton's method
  • Semismoothness
  • Sparse optimization

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
