TY - JOUR
T1 - Convergence rate analysis of a sequential convex programming method with line search for a class of constrained difference-of-convex optimization problems
AU - Yu, Peiran
AU - Pong, Ting Kei
AU - Lu, Zhaosong
N1 - Funding Information:
Received by the editors January 21, 2020; accepted for publication (in revised form) May 17, 2021; published electronically August 9, 2021. https://doi.org/10.1137/20M1314057. Funding: The second author was supported partly by Hong Kong Research Grants Council grant PolyU153005/17p. Author affiliations: Applied Mathematics, Hong Kong Polytechnic University, Kowloon, Hong Kong ([email protected], [email protected]); Department of Industrial and Systems Engineering, University of Minnesota, Minneapolis, MN 55455, USA ([email protected]).
Publisher Copyright:
© 2021 Society for Industrial and Applied Mathematics.
PY - 2021/8/9
Y1 - 2021/8/9
AB - In this paper, we study the sequential convex programming method with monotone line search (SCPls) in [Z. Lu, Sequential Convex Programming Methods for a Class of Structured Nonlinear Programming, https://arxiv.org/abs/1210.3039, 2012] for a class of difference-of-convex optimization problems with multiple smooth inequality constraints. SCPls is a representative variant of moving-ball-approximation-type algorithms [A. Auslender, R. Shefi, and M. Teboulle, SIAM J. Optim., 20 (2010), pp. 3232-3259; J. Bolte, Z. Chen, and E. Pauwels, Math. Program., 182 (2020), pp. 1-36; J. Bolte and E. Pauwels, Math. Oper. Res., 41 (2016), pp. 442-465; R. Shefi and M. Teboulle, Math. Program., 159 (2016), pp. 137-164] for constrained optimization problems. We analyze the convergence rate of the sequence generated by SCPls in both nonconvex and convex settings by imposing suitable Kurdyka-Łojasiewicz (KL) assumptions. Specifically, in the nonconvex setting, we assume that a special potential function related to the objective and the constraints is a KL function, while in the convex setting we impose KL assumptions directly on the extended objective function (i.e., the sum of the objective and the indicator function of the constraint set). A relationship between these two different KL assumptions is established in the convex setting under additional differentiability assumptions. We also discuss how to deduce the KL exponent of the extended objective function from its Lagrangian in the convex setting, under additional assumptions on the constraint functions. Thanks to this result, the extended objectives of some constrained optimization models, such as minimizing ℓ1 subject to logistic/Poisson loss, are found to be KL functions with exponent 1/2 under mild assumptions. To illustrate how our results can be applied, we consider SCPls for minimizing ℓ1-2 [P. Yin, Y. Lou, Q. He, and J. Xin, SIAM J. Sci. Comput., 37 (2015), pp. A536-A563] subject to residual error measured by the ℓ2 norm/Lorentzian norm [R. E. Carrillo, A. B. Ramirez, G. R. Arce, K. E. Barner, and B. M. Sadler, EURASIP J. Adv. Signal Process. (2016), 108]. We first discuss how the various conditions required in our analysis can be verified, and then perform numerical experiments to illustrate the convergence behavior of SCPls.
KW - Constrained optimization
KW - Difference-of-convex optimization
KW - Kurdyka-Łojasiewicz property
KW - Linear convergence
UR - http://www.scopus.com/inward/record.url?scp=85114103520&partnerID=8YFLogxK
U2 - 10.1137/20M1314057
DO - 10.1137/20M1314057
M3 - Journal article
AN - SCOPUS:85114103520
SN - 1052-6234
VL - 31
SP - 2024
EP - 2054
JO - SIAM Journal on Optimization
JF - SIAM Journal on Optimization
IS - 3
ER -