## Abstract

In this paper, we focus on the constrained sparse regression problem, where the loss function is convex but nonsmooth and the penalty term is defined by the cardinality function. First, we give an exact continuous relaxation problem in the sense that both problems have the same optimal solution set. Moreover, we show that a vector is a local minimizer with the lower bound property of the original problem if and only if it is a lifted stationary point of the relaxation problem. Second, we propose a smoothing proximal gradient (SPG) algorithm for finding a lifted stationary point of the continuous relaxation model. Our algorithm is a novel combination of the classical proximal gradient algorithm and the smoothing method. We prove that the proposed SPG algorithm globally converges to a lifted stationary point of the relaxation problem, has the local convergence rate of $o(k^{-\tau})$ with $\tau \in (0, 1/2)$ on the objective function value, and identifies the zero entries of the lifted stationary point in finitely many iterations. Finally, we use three examples to illustrate the validity of the continuous relaxation model and the good numerical performance of the SPG algorithm.


| Original language | English |
|---|---|
| Pages (from-to) | 858-883 |
| Number of pages | 26 |
| Journal | SIAM Journal on Numerical Analysis |
| Volume | 58 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 27 Feb 2020 |

## Keywords

- nonsmooth convex regression
- cardinality penalty
- proximal gradient method
- smoothing method
- global sequence convergence