An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems

Yangjing Zhang, Ning Zhang, Defeng Sun, Kim Chuan Toh

Research output: Journal article › Academic research › peer-review

24 Citations (Scopus)


The sparse group Lasso is a widely used statistical model which encourages sparsity both at the group level and within each group. In this paper, we develop an efficient augmented Lagrangian method for large-scale non-overlapping sparse group Lasso problems, with each subproblem solved by a superlinearly convergent inexact semismooth Newton method. Theoretically, we prove that, if the penalty parameter is chosen sufficiently large, the augmented Lagrangian method converges globally at an arbitrarily fast linear rate for the primal iterative sequence, the dual infeasibility, and the duality gap of the primal and dual objective functions. Computationally, we derive explicitly the generalized Jacobian of the proximal mapping associated with the sparse group Lasso regularizer and fully exploit the underlying second order sparsity through the semismooth Newton method. The efficiency and robustness of the proposed algorithm are demonstrated by numerical experiments on both synthetic and real data sets.
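The proximal mapping mentioned in the abstract has a well-known closed form for non-overlapping groups: it decomposes into elementwise soft-thresholding (for the ℓ1 term) followed by group-wise soft-thresholding (for the ℓ2-norm term). As a hedged illustration of that mapping only (not of the paper's augmented Lagrangian or semismooth Newton machinery), a minimal sketch might look like this; the function name, the `groups` representation as a list of index arrays, and the parameter names `lam1`/`lam2` are illustrative assumptions:

```python
import numpy as np

def prox_sparse_group_lasso(v, groups, lam1, lam2):
    """Proximal mapping of p(x) = lam1*||x||_1 + lam2 * sum_g ||x_g||_2,
    where `groups` is a list of index arrays partitioning range(len(v)).
    (Illustrative sketch; names and signature are assumptions, not the
    paper's API.)
    """
    x = np.zeros_like(v, dtype=float)
    for g in groups:
        # Step 1: elementwise soft-thresholding (prox of the l1 term).
        u = np.sign(v[g]) * np.maximum(np.abs(v[g]) - lam1, 0.0)
        # Step 2: group soft-thresholding (prox of the group l2-norm term);
        # the whole group is set to zero when its norm falls below lam2.
        norm = np.linalg.norm(u)
        if norm > lam2:
            x[g] = (1.0 - lam2 / norm) * u
    return x
```

The piecewise structure of this mapping is what gives rise to the generalized Jacobian the paper computes: on each group, the map is either identically zero or a smooth shrinkage, and the Newton method exploits the resulting second order sparsity.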

Original language: English
Pages (from-to): 223-263
Number of pages: 41
Journal: Mathematical Programming
Issue number: 1-2
Publication status: Published - Jan 2020


Keywords

  • Augmented Lagrangian method
  • Generalized Jacobian
  • Semismooth Newton method
  • Sparse group Lasso

ASJC Scopus subject areas

  • Software
  • General Mathematics


