Abstract
This paper is concerned with $\ell_q$ ($0<q<1$)-norm regularized minimization problems with a twice continuously differentiable loss function. For this class of nonconvex and nonsmooth composite problems, many algorithms have been proposed, most of which are first-order methods. In this work, we propose a hybrid of the proximal gradient method and the subspace regularized Newton method, called HpgSRN. The whole iterate sequence produced by HpgSRN is proved to have finite length and to converge to an L-stationary point under a mild curve-ratio condition and the Kurdyka-Łojasiewicz property of the cost function; it converges linearly if, in addition, the Kurdyka-Łojasiewicz property holds with exponent 1/2. Moreover, a superlinear convergence rate for the iterate sequence is achieved under an additional local error bound condition. Our convergence results do not require the isolatedness or strict local minimality of the L-stationary point. Numerical comparisons with ZeroFPR, a hybrid of the proximal gradient method and a quasi-Newton method applied to the forward-backward envelope of the cost function, proposed in [A. Themelis, L. Stella, and P. Patrinos, SIAM J. Optim., 28 (2018), pp. 2274-2303], on $\ell_q$-norm regularized linear and logistic regressions with real data indicate that HpgSRN not only requires much less computing time but also yields comparable or even better sparsity levels and objective values.
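For concreteness, the problem class described in the abstract can be written as follows. This is a minimal sketch consistent with the abstract, not a formulation taken from the paper itself; the symbols $F$, $\lambda$, and $\mu_k$ are notation assumed here.

```latex
% Assumed model problem: smooth loss f plus the lq quasi-norm penalty
\[
  \min_{x \in \mathbb{R}^n} \; F(x) := f(x) + \lambda \|x\|_q^q,
  \qquad \|x\|_q^q := \sum_{i=1}^n |x_i|^q, \quad 0 < q < 1,
\]
% where f is twice continuously differentiable and lambda > 0.
% A proximal gradient step with step size mu_k > 0 then reads
\[
  x^{k+1} \in \operatorname{prox}_{\mu_k \lambda \|\cdot\|_q^q}
  \bigl(x^k - \mu_k \nabla f(x^k)\bigr),
\]
% and a hybrid scheme of the kind named in the abstract would
% interleave such steps with Newton-type steps restricted to the
% subspace of currently nonzero components.
```

Note that $\|\cdot\|_q^q$ for $0<q<1$ is a quasi-norm, so $F$ is nonconvex and nonsmooth; this is what places the problem outside the reach of standard convex proximal methods.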
| Original language | English |
| --- | --- |
| Pages (from-to) | 1676-1706 |
| Number of pages | 31 |
| Journal | SIAM Journal on Optimization |
| Volume | 33 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Sept 2023 |
Keywords
- $\ell_q$-norm regularized composite optimization
- global convergence
- KL property
- local error bound
- regularized Newton method
- superlinear convergence rate
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Applied Mathematics