A Smoothing Proximal Gradient Algorithm for Nonsmooth Convex Regression with Cardinality Penalty

Wei Bian, Xiaojun Chen

Research output: Journal article (academic research, peer-reviewed)


In this paper, we focus on the constrained sparse regression problem, where the loss function is convex but nonsmooth and the penalty term is defined by the cardinality function. First, we give an exact continuous relaxation problem in the sense that both problems have the same optimal solution set. Moreover, we show that a vector is a local minimizer with the lower bound property of
the original problem if and only if it is a lifted stationary point of the relaxation problem. Second, we propose a smoothing proximal gradient (SPG) algorithm for finding a lifted stationary point of the continuous relaxation model. Our algorithm is a novel combination of the classical proximal gradient algorithm and the smoothing method. We prove that the proposed SPG algorithm globally converges to a lifted stationary point of the relaxation problem, has a local convergence rate of o(k^{-τ}) with τ ∈ (0, 1/2) on the objective function value, and identifies the zero entries of the lifted stationary point in finitely many iterations. Finally, we use three examples to illustrate the validity of the continuous relaxation model and the good numerical performance of the SPG algorithm.
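The abstract describes the SPG scheme only at a high level; a minimal sketch of one plausible instantiation is given below. Here the cardinality penalty is relaxed by the capped-ℓ1 function φ(t) = min(1, |t|/ν), the nonsmooth ℓ1 loss is smoothed with a Huber-type function whose parameter μ_k decreases with k, and each iteration takes a gradient step on the smoothed loss followed by an exact proximal step for the capped-ℓ1 penalty. The function name `spg_sketch`, the schedules, and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def spg_sketch(A, b, lam=0.1, nu=0.5, n_iter=2000):
    """Illustrative smoothing proximal gradient iteration (not the paper's exact method).

    Model problem: min_x ||Ax - b||_1 + lam * sum_i min(1, |x_i|/nu),
    i.e., an l1 loss with the capped-l1 relaxation of the cardinality penalty.
    """
    m, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2  # spectral norm squared; smoothed gradient is (L/mu)-Lipschitz
    for k in range(1, n_iter + 1):
        mu = 1.0 / np.sqrt(k)               # decreasing smoothing parameter (assumed schedule)
        r = A @ x - b
        g = A.T @ np.clip(r / mu, -1.0, 1.0)  # gradient of the Huber smoothing of ||r||_1
        step = mu / L                         # step size tied to the current smoothing level
        y = x - step * g

        # Exact entrywise prox of z -> 0.5*(z-y)^2/step + lam*min(1, |z|/nu):
        # on |z| <= nu the penalty is linear (soft-thresholding, then clip to [-nu, nu]);
        # on |z| >= nu it is flat (so the prox there is y, or sign(y)*nu if |y| < nu).
        soft = np.sign(y) * np.maximum(np.abs(y) - step * lam / nu, 0.0)
        cand1 = np.clip(soft, -nu, nu)
        cand2 = np.where(np.abs(y) >= nu, y, np.sign(y) * nu)

        def obj(z):
            return 0.5 * (z - y) ** 2 / step + lam * np.minimum(1.0, np.abs(z) / nu)

        x = np.where(obj(cand1) <= obj(cand2), cand1, cand2)
    return x
```

The per-entry comparison of the two candidates makes the nonconvex capped-ℓ1 proximal step exact, which is what allows such a scheme to identify zero entries after finitely many iterations.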
Original language: English
Pages (from-to): 858-883
Number of pages: 26
Journal: SIAM Journal on Numerical Analysis
Issue number: 1
Publication status: Published - 27 Feb 2020


Keywords

  • nonsmooth convex regression
  • cardinality penalty
  • proximal gradient method
  • smoothing method
  • global sequence convergence
