TY - JOUR
T1 - Linearized Proximal Algorithms with Adaptive Stepsizes for Convex Composite Optimization with Applications
AU - Hu, Yaohua
AU - Li, Chong
AU - Wang, Jinhua
AU - Yang, Xiaoqi
AU - Zhu, Linglingzhi
N1 - Funding Information:
The authors are grateful to the editor and the anonymous reviewers for their valuable comments and suggestions toward the improvement of this paper. Yaohua Hu’s work was supported in part by the National Natural Science Foundation of China (12222112, 12071306, 11871347, 32170655), the Project of Educational Commission of Guangdong Province of China (2021KTSCX103, 2019KZDZX1007), and the Natural Science Foundation of Shenzhen (JCYJ20190808173603590). Chong Li’s work was supported in part by the National Natural Science Foundation of China (11971429, 12071441). Jinhua Wang’s work was supported in part by the National Natural Science Foundation of China (Grant 12171131). Xiaoqi Yang’s work was supported in part by the Research Grants Council of Hong Kong (PolyU 15216518).
Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2023/3/13
Y1 - 2023/3/13
N2 - We propose an inexact linearized proximal algorithm with an adaptive stepsize, together with its globalized version based on backtracking line search, to solve a convex composite optimization problem. Under the assumptions of local weak sharp minima of order p (p ≥ 1) for the outer convex function and a quasi-regularity condition for the inclusion problem associated with the inner function, we establish superlinear/quadratic convergence results for the proposed algorithms. Compared to the linearized proximal algorithms with a constant stepsize proposed in Hu et al. (SIAM J Optim 26(2):1207–1235, 2016), our algorithms enjoy broader applicability and higher convergence rates, and the analysis in the present paper deviates significantly from that of Hu et al. (2016). Numerical applications to the nonnegative inverse eigenvalue problem and the wireless sensor network localization problem indicate that the proposed algorithms are more efficient and robust, outperforming the algorithms in Hu et al. (2016) and some popular algorithms for the relevant problems.
AB - We propose an inexact linearized proximal algorithm with an adaptive stepsize, together with its globalized version based on backtracking line search, to solve a convex composite optimization problem. Under the assumptions of local weak sharp minima of order p (p ≥ 1) for the outer convex function and a quasi-regularity condition for the inclusion problem associated with the inner function, we establish superlinear/quadratic convergence results for the proposed algorithms. Compared to the linearized proximal algorithms with a constant stepsize proposed in Hu et al. (SIAM J Optim 26(2):1207–1235, 2016), our algorithms enjoy broader applicability and higher convergence rates, and the analysis in the present paper deviates significantly from that of Hu et al. (2016). Numerical applications to the nonnegative inverse eigenvalue problem and the wireless sensor network localization problem indicate that the proposed algorithms are more efficient and robust, outperforming the algorithms in Hu et al. (2016) and some popular algorithms for the relevant problems.
KW - Adaptive stepsize
KW - Convex composite optimization
KW - Convex inclusion problem
KW - Linearized proximal algorithm
KW - Quadratic convergence
UR - http://www.scopus.com/inward/record.url?scp=85150190173&partnerID=8YFLogxK
U2 - 10.1007/s00245-022-09957-x
DO - 10.1007/s00245-022-09957-x
M3 - Journal article
AN - SCOPUS:85150190173
SN - 0095-4616
VL - 87
SP - 1
EP - 35
JO - Applied Mathematics and Optimization
JF - Applied Mathematics and Optimization
IS - 3
M1 - 52
ER -