TY - JOUR
T1 - Doubly Iteratively Reweighted Algorithm for Constrained Compressed Sensing Models
AU - Sun, Shuqin
AU - Pong, Ting Kei
N1 - Funding Information:
Ting Kei Pong was supported in part by Hong Kong Research Grants Council grant PolyU153000/20p.
Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2023/3
Y1 - 2023/3
N2 - We propose a new algorithmic framework for constrained compressed sensing models that admit nonconvex sparsity-inducing regularizers, such as the log-penalty function, as objectives, and nonconvex loss functions, such as the Cauchy loss function and the Tukey biweight loss function, in the constraint. Our framework employs iteratively reweighted ℓ1 and ℓ2 schemes to construct subproblems that can be efficiently solved by well-developed solvers for basis pursuit denoising such as SPGL1 by van den Berg and Friedlander (SIAM J Sci Comput 31:890-912, 2008). We propose a new termination criterion for the subproblem solvers that allows them to return an infeasible solution, together with a suitably constructed feasible point satisfying a descent condition. The feasible point construction step is key to establishing the well-definedness of our proposed algorithm, and we also prove that, under suitable assumptions, any accumulation point of this sequence of feasible points is a stationary point of the constrained compressed sensing model. Finally, we numerically compare our algorithm (with subproblems solved by SPGL1 or the alternating direction method of multipliers) against the SCPls in Yu et al. (SIAM J Optim 31:2024-2054, 2021) on constrained compressed sensing models with the log-penalty function as the objective and the Cauchy loss function in the constraint, for badly scaled measurement matrices. Our computational results show that our approaches return solutions with better recovery errors, and are always faster.
AB - We propose a new algorithmic framework for constrained compressed sensing models that admit nonconvex sparsity-inducing regularizers, such as the log-penalty function, as objectives, and nonconvex loss functions, such as the Cauchy loss function and the Tukey biweight loss function, in the constraint. Our framework employs iteratively reweighted ℓ1 and ℓ2 schemes to construct subproblems that can be efficiently solved by well-developed solvers for basis pursuit denoising such as SPGL1 by van den Berg and Friedlander (SIAM J Sci Comput 31:890-912, 2008). We propose a new termination criterion for the subproblem solvers that allows them to return an infeasible solution, together with a suitably constructed feasible point satisfying a descent condition. The feasible point construction step is key to establishing the well-definedness of our proposed algorithm, and we also prove that, under suitable assumptions, any accumulation point of this sequence of feasible points is a stationary point of the constrained compressed sensing model. Finally, we numerically compare our algorithm (with subproblems solved by SPGL1 or the alternating direction method of multipliers) against the SCPls in Yu et al. (SIAM J Optim 31:2024-2054, 2021) on constrained compressed sensing models with the log-penalty function as the objective and the Cauchy loss function in the constraint, for badly scaled measurement matrices. Our computational results show that our approaches return solutions with better recovery errors, and are always faster.
KW - Compressed sensing
KW - Inexact subproblems
KW - Iteratively reweighted algorithms
UR - https://www.scopus.com/pages/publications/85150341555
U2 - 10.1007/s10589-023-00468-1
DO - 10.1007/s10589-023-00468-1
M3 - Journal article
AN - SCOPUS:85150341555
SN - 0926-6003
VL - 85
SP - 583
EP - 619
JO - Computational Optimization and Applications
JF - Computational Optimization and Applications
IS - 2
ER -