TY - JOUR
T1 - Kurdyka–Łojasiewicz Exponent via Inf-projection
AU - Yu, Peiran
AU - Li, Guoyin
AU - Pong, Ting Kei
N1 - Funding Information:
Ting Kei Pong: This author was supported in part by the Hong Kong Research Grants Council grant PolyU153005/17p.
Funding Information:
Guoyin Li: This author was partially supported by a Future Fellowship (FT130100038) and a Discovery Project (DP190100555) from the Australian Research Council.
Publisher Copyright:
© 2021, SFoCM.
PY - 2021/7/9
Y1 - 2021/7/9
AB - The Kurdyka–Łojasiewicz (KL) exponent plays an important role in estimating the convergence rates of many contemporary first-order methods. In particular, a KL exponent of 1/2 for a suitable potential function is related to local linear convergence. Nevertheless, the KL exponent is in general extremely hard to estimate. In this paper, we show under mild assumptions that the KL exponent is preserved via inf-projection. Inf-projection is a fundamental operation that is ubiquitous when reformulating optimization problems via the lift-and-project approach. By studying how this operation acts on the KL exponent, we show that the KL exponent is 1/2 for several important convex optimization models, including some semidefinite-programming-representable functions and some functions that involve C^2-cone reducible structures, under conditions such as strict complementarity. Our results are applicable to concrete optimization models such as the group-fused Lasso and the overlapping group Lasso. In addition, for nonconvex models, we show that the KL exponent of many difference-of-convex functions can be derived from that of their natural majorant functions, and that the KL exponent of the Bregman envelope of a function is the same as that of the function itself. Finally, we estimate the KL exponent of the sum of the least squares function and the indicator function of the set of matrices of rank at most k.
KW - Convergence rate
KW - First-order methods
KW - Inf-projection
KW - Kurdyka–Łojasiewicz exponent
KW - Kurdyka–Łojasiewicz inequality
UR - http://www.scopus.com/inward/record.url?scp=85110282050&partnerID=8YFLogxK
U2 - 10.1007/s10208-021-09528-6
DO - 10.1007/s10208-021-09528-6
M3 - Journal article
AN - SCOPUS:85110282050
SN - 1615-3375
VL - 22
SP - 1171
EP - 1217
JO - Foundations of Computational Mathematics
JF - Foundations of Computational Mathematics
IS - 4
ER -
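
To make the abstract's key terms concrete, here is a minimal LaTeX sketch of the two standard definitions involved, stated as they commonly appear in the KL literature; the symbols f, F, x-bar, alpha, c, epsilon, and nu are our notation for illustration and are not drawn from this record. A proper closed function $f$ has the KL property at $\bar x \in \operatorname{dom}\partial f$ with exponent $\alpha \in [0,1)$ if there exist $c, \epsilon > 0$ and $\nu \in (0,\infty]$ such that

\[
  \operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; c\,\bigl(f(x) - f(\bar x)\bigr)^{\alpha}
  \quad \text{whenever } \|x - \bar x\| \le \epsilon \text{ and } f(\bar x) < f(x) < f(\bar x) + \nu .
\]

The case $\alpha = 1/2$ is the one the abstract ties to local linear convergence of first-order methods. The inf-projection of $f$ is the function

\[
  F(x) \;:=\; \inf_{y} \, f(x, y),
\]

and the paper's central result is that, under mild assumptions, $F$ inherits the KL exponent of $f$.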