An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization

Abstract
This paper focuses on minimizing the sum of a twice continuously differentiable function f and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed in which the Hessian of f is regularized by a term involving the ρth power of the KKT residual. For ρ = 0, we justify the global convergence of the iterate sequence for a KL objective function and its R-linear convergence rate for a KL objective function of exponent 1/2. For ρ ∈ (0, 1), by assuming that cluster points satisfy a locally Hölderian error bound of order q on the second-order stationary point set or, respectively, a local error bound of order q > 1 + ρ on the common stationary point set, we establish the global convergence of the iterate sequence and its superlinear convergence rate, with order depending on q and ρ. A dual semismooth Newton augmented Lagrangian method is also developed to seek inexact minimizers of the subproblems. Numerical comparisons with two state-of-the-art methods on ℓ₁-regularized Student's t-regressions, group penalized Student's t-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.
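To make the regularization scheme described above concrete, the following is a minimal sketch, not the authors' implementation: it specializes the nonsmooth term to the ℓ₁ norm, regularizes the Hessian by c · r(x)^ρ · I, where r(x) is the proximal-gradient KKT residual, and solves each subproblem inexactly with plain proximal-gradient inner iterations rather than the paper's dual semismooth Newton augmented Lagrangian method. The constant c, the iteration counts, and all function names are illustrative assumptions.

```python
import numpy as np

def prox_l1(v, t):
    # Soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def kkt_residual(x, grad, lam):
    # r(x) = ||x - prox_{lam ||.||_1}(x - grad f(x))||; zero iff x is stationary.
    return np.linalg.norm(x - prox_l1(x - grad, lam))

def reg_prox_newton(f_grad, f_hess, lam, x0, rho=0.5, c=1.0,
                    outer_iters=30, inner_iters=100, tol=1e-10):
    """Sketch of a regularized proximal Newton iteration for
    min f(x) + lam * ||x||_1, with Hessian regularization c * r(x)^rho * I."""
    x = x0.copy()
    for _ in range(outer_iters):
        g = f_grad(x)
        r = kkt_residual(x, g, lam)
        if r < tol:
            break
        # Regularized Hessian: vanishes as the KKT residual r -> 0.
        H = f_hess(x) + c * (r ** rho) * np.eye(x.size)
        # Inexactly minimize the subproblem
        #   q(y) = g'(y - x) + 0.5 (y - x)' H (y - x) + lam * ||y||_1
        # by proximal-gradient steps with step size 1 / ||H||_2.
        L = np.linalg.norm(H, 2)
        y = x.copy()
        for _ in range(inner_iters):
            qgrad = g + H @ (y - x)
            y = prox_l1(y - qgrad / L, lam / L)
        x = y
    return x
```

Because the regularization term is proportional to r(x)^ρ, it vanishes as the iterates approach a stationary point, so the local behavior reverts to an (inexact) proximal Newton step; this is the mechanism behind the superlinear rates stated in the abstract.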
| Original language | English |
|---|---|
| Pages (from-to) | 603-641 |
| Number of pages | 38 |
| Journal | Computational Optimization and Applications |
| Volume | 88 |
| Issue number | 2 |
| Early online date | 20 Feb 2024 |
| DOIs | |
| Publication status | Published - 20 Feb 2024 |
Keywords
- Nonconvex and nonsmooth optimization
- Regularized proximal Newton method
- Global convergence
- Convergence rate
- KL function
- Metric q-subregularity
ASJC Scopus subject areas
- Control and Optimization
- Computational Mathematics
- Applied Mathematics