Linearized Proximal Algorithms with Adaptive Stepsizes for Convex Composite Optimization with Applications

Yaohua Hu, Chong Li, Jinhua Wang, Xiaoqi Yang, Linglingzhi Zhu

Research output: Journal article publication › Journal article › Academic research › peer-review

1 Citation (Scopus)

Abstract

We propose an inexact linearized proximal algorithm with an adaptive stepsize, together with its globalized version based on a backtracking line search, to solve a convex composite optimization problem. Under the assumptions of local weak sharp minima of order p (p ≥ 1) for the outer convex function and a quasi-regularity condition for the inclusion problem associated with the inner function, we establish superlinear/quadratic convergence results for the proposed algorithms. Compared with the linearized proximal algorithms with a constant stepsize proposed in Hu et al. (SIAM J Optim 26(2):1207–1235, 2016), our algorithms have broader applicability and higher convergence rates, and the analysis in the present paper deviates significantly from that of Hu et al. (2016). Numerical applications to the nonnegative inverse eigenvalue problem and the wireless sensor network localization problem indicate that the proposed algorithms are more efficient and robust, and outperform the algorithms in Hu et al. (2016) as well as some popular algorithms for the relevant problems.
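For readers unfamiliar with this method class, the Python sketch below illustrates the basic linearized proximal iteration with a backtracking line search for minimizing a composite function h(F(x)). It is not the authors' algorithm: to keep the subproblem in closed form, the outer convex function is specialized to h(y) = 0.5·||y||², and the proximal parameter v, the line-search parameters beta and sigma, and the test problem are all illustrative assumptions; the paper treats general convex h, inexact subproblem solves, and adaptively chosen stepsizes.

import numpy as np

def linearized_proximal(F, JF, x0, v=1.0, beta=0.5, sigma=1e-4,
                        max_iter=100, tol=1e-10):
    # Linearized proximal iteration with backtracking line search for
    # min_x h(F(x)), specialized here to h(y) = 0.5*||y||^2 so that the
    # subproblem
    #   d_k = argmin_d h(F(x_k) + JF(x_k) d) + (1/(2v)) ||d||^2
    # has the closed form d_k = -(J^T J + (1/v) I)^{-1} J^T F(x_k).
    h = lambda y: 0.5 * float(np.dot(y, y))   # illustrative outer convex function
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx, J = F(x), JF(x)
        d = -np.linalg.solve(J.T @ J + np.eye(x.size) / v, J.T @ Fx)
        if np.linalg.norm(d) < tol:
            break
        f0 = h(Fx)
        pred = f0 - h(Fx + J @ d)             # decrease predicted by the linearized model
        t = 1.0
        # Backtrack until the actual decrease of f = h(F(.)) is at least a
        # fraction sigma of the predicted decrease (Armijo-type condition).
        while h(F(x + t * d)) > f0 - sigma * t * pred and t > 1e-12:
            t *= beta
        x = x + t * d
    return x

# Illustrative use on a small nonlinear system; one exact solution is (1, 2).
F  = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
JF = lambda x: np.array([[2.0*x[0], 1.0], [1.0, 2.0*x[1]]])
x_hat = linearized_proximal(F, JF, x0=np.array([0.5, 0.5]))

The quadratic outer function above makes each step a regularized Gauss-Newton step; for a nonsmooth convex h (the setting of the paper) the subproblem would instead be solved, possibly inexactly, by a convex solver.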

Original language: English
Article number: 52
Pages (from-to): 1-35
Number of pages: 35
Journal: Applied Mathematics and Optimization
Volume: 87
Issue number: 3
DOIs
Publication status: Published - 13 Mar 2023

Keywords

  • Adaptive stepsize
  • Convex composite optimization
  • Convex inclusion problem
  • Linearized proximal algorithm
  • Quadratic convergence

ASJC Scopus subject areas

  • Control and Optimization
  • Applied Mathematics
