Abstract
This study aims to minimize the sum of a smooth function and a nonsmooth ℓ1-regularized term. The problem includes, as a special case, the ℓ1-regularized convex minimization problems arising in signal processing, compressive sensing, machine learning, data mining, and so on. However, the non-differentiability of the ℓ1-norm poses additional challenges, especially in the large-scale problems encountered in many practical applications. This study proposes, analyzes, and tests a Barzilai-Borwein gradient algorithm. At each iteration, the generated search direction possesses the descent property and can be derived easily by minimizing a local approximate quadratic model while exploiting the favorable structure of the ℓ1-norm. A nonmonotone line search technique is incorporated to find a suitable stepsize along this direction. The algorithm is easy to implement: each iteration requires only the value of the objective function and the gradient of the smooth term. Under suitable conditions, the proposed algorithm is shown to be globally convergent. Limited experiments on nonconvex unconstrained problems from the CUTEr library with additive ℓ1-regularization illustrate that the proposed algorithm performs quite satisfactorily. Extensive experiments on ℓ1-regularized least squares problems in compressive sensing verify that our algorithm compares favorably with several state-of-the-art algorithms specifically designed for such problems in recent years.
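The ingredients named in the abstract follow a standard iterative-shrinkage template, sketched below in Python: a search direction obtained by minimizing a local quadratic model of the smooth term plus the exact ℓ1 term (which reduces to soft-thresholding), a nonmonotone line search against the worst recent objective value, and a safeguarded Barzilai-Borwein stepsize. This is a minimal illustration under those assumptions, not the authors' exact method; the names `bb_l1_sketch` and `soft_threshold` and the parameters `memory` and `sigma` are hypothetical.

```python
import numpy as np

def soft_threshold(z, tau):
    """Componentwise soft-thresholding, the proximal map of tau * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def bb_l1_sketch(s, grad_s, lam, x0, max_iter=500, tol=1e-8,
                 memory=10, sigma=1e-4, alpha_min=1e-10, alpha_max=1e10):
    """Sketch of a Barzilai-Borwein gradient method for min s(x) + lam*||x||_1."""
    phi = lambda z: s(z) + lam * np.linalg.norm(z, 1)  # full objective
    x = np.asarray(x0, dtype=float).copy()
    g = grad_s(x)
    f_hist = [phi(x)]
    alpha = 1.0  # initial BB stepsize
    for _ in range(max_iter):
        # Direction: minimize g'd + ||d||^2/(2*alpha) + lam*||x + d||_1,
        # whose closed-form solution is a soft-thresholding step.
        d = soft_threshold(x - alpha * g, lam * alpha) - x
        if np.linalg.norm(d) <= tol:
            break
        # Model decrease; negative whenever d != 0, so d is a descent direction.
        delta = g @ d + lam * (np.linalg.norm(x + d, 1) - np.linalg.norm(x, 1))
        # Nonmonotone line search: compare against the worst recent objective.
        f_ref = max(f_hist[-memory:])
        t = 1.0
        while phi(x + t * d) > f_ref + sigma * t * delta and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad_s(x_new)
        # BB stepsize from the secant pair, safeguarded to [alpha_min, alpha_max].
        sk, yk = x_new - x, g_new - g
        sy = sk @ yk
        if sy > 0:
            alpha = min(max((sk @ sk) / sy, alpha_min), alpha_max)
        x, g = x_new, g_new
        f_hist.append(phi(x))
    return x

# Example: l1-regularized least squares, s(x) = 0.5 * ||Ax - b||^2,
# the compressive-sensing test problem mentioned in the abstract.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
x_hat = bb_l1_sketch(lambda x: 0.5 * np.sum((A @ x - b) ** 2),
                     lambda x: A.T @ (A @ x - b), lam=0.1, x0=np.zeros(100))
```

Comparing the model at `d` with `d = 0` shows `delta <= -||d||^2 / (2 * alpha) < 0` whenever the iterate is not stationary, which is why the backtracking loop terminates and the direction is a descent direction.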
| Original language | English |
| --- | --- |
| Pages (from-to) | 17-41 |
| Number of pages | 25 |
| Journal | Journal of Scientific Computing |
| Volume | 61 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 1 Jan 2014 |
Keywords
- Barzilai-Borwein gradient algorithm
- Compressive sensing
- Nonconvex optimization
- Nonmonotone line search
- Nonsmooth optimization
- ℓ1 regularization
ASJC Scopus subject areas
- Theoretical Computer Science
- Software
- Engineering (all)
- Computational Theory and Mathematics