Abstract
The ℓp regularization problem with 0 < p < 1 has been widely studied for finding sparse solutions of linear inverse problems and has found successful applications in various fields of mathematics and applied science. The proximal gradient algorithm is one of the most popular algorithms for solving the ℓp regularization problem. In the present paper, we investigate the linear convergence of one inexact descent method and two inexact proximal gradient algorithms (PGAs). For this purpose, an optimality condition theorem is established that provides the equivalence among a local minimum, a second-order optimality condition, and a second-order growth property of the ℓp regularization problem. By virtue of the second-order optimality condition and the second-order growth property, we establish the linear convergence of the inexact descent method and the inexact PGAs under some simple assumptions. Both linear convergence to a local minimal value and linear convergence to a local minimum are provided. Finally, the linear convergence results of these methods are extended to infinite-dimensional Hilbert spaces. Our results cannot be established under the framework of Kurdyka–Łojasiewicz theory.
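For reference, the ℓp regularization problem and the (exact) proximal gradient step named in the abstract are commonly written in the following standard form; the notation below is the usual one in the sparse-optimization literature and is assumed here, not taken from the paper itself:

```latex
% l_p regularization of the linear inverse problem Ax = b
% (standard formulation; A, b, lambda are assumed notation)
\min_{x \in \mathbb{R}^n} \; F(x) :=
  \frac{1}{2}\,\|Ax - b\|_2^2 + \lambda \|x\|_p^p,
\qquad
\|x\|_p^p = \sum_{i=1}^{n} |x_i|^p, \quad 0 < p < 1, \; \lambda > 0.

% One proximal gradient iteration with step size t > 0:
x^{k+1} \in \operatorname{prox}_{t\lambda\|\cdot\|_p^p}
  \!\left(x^k - t\, A^{\top}(Ax^k - b)\right),
\qquad
\operatorname{prox}_{g}(y) :=
  \operatorname*{arg\,min}_{x} \; \tfrac{1}{2}\|x - y\|_2^2 + g(x).
```

Since 0 < p < 1 makes the penalty nonconvex, the proximal map is set-valued (hence the inclusion), which is part of why the paper develops second-order conditions rather than relying on Kurdyka–Łojasiewicz arguments alone.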
Original language | English
---|---
Pages (from-to) | 853-883
Number of pages | 31
Journal | Journal of Global Optimization
Volume | 79
Issue number | 4
DOIs | 
Publication status | Published - Apr 2021
Keywords
- Descent methods
- Inexact approach
- Linear convergence
- Nonconvex regularization
- Proximal gradient algorithms
- Sparse optimization
ASJC Scopus subject areas
- Computer Science Applications
- Management Science and Operations Research
- Control and Optimization
- Applied Mathematics