Abstract
We propose a first-order method to solve the cubic regularization subproblem (CRS) based on a novel reformulation. The reformulation is a constrained convex optimization problem whose feasible region admits an easily computable projection. Our reformulation requires computing the minimum eigenvalue of the Hessian. To avoid the expensive computation of the exact minimum eigenvalue, we develop a surrogate problem to the reformulation in which the exact minimum eigenvalue is replaced with an approximate one. We then apply first-order methods such as Nesterov's accelerated projected gradient method (APG) and the projected Barzilai-Borwein method to solve the surrogate problem. As our main theoretical contribution, we show that when an ϵ-approximate minimum eigenvalue is computed by the Lanczos method and the surrogate problem is approximately solved by APG, our approach returns an ϵ-approximate solution to CRS in Õ(ϵ^{-1/2}) matrix-vector multiplications, where Õ(·) hides logarithmic factors. Numerical experiments show that our methods are comparable to the Krylov subspace method in the easy case and outperform it in the hard case. We further implement our methods as subproblem solvers of adaptive cubic regularization methods, and numerical results show that our algorithms are comparable to the state-of-the-art algorithms.
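The abstract relies on the Lanczos method to approximate the minimum eigenvalue of the Hessian via matrix-vector products only. As a generic illustration of that component (not the authors' implementation, and with all names such as `lanczos_min_eig` and `H_mv` being hypothetical), a minimal NumPy sketch might look like:

```python
import numpy as np

def lanczos_min_eig(H_mv, n, k=30, seed=0):
    """Approximate the minimum eigenvalue of an n-by-n symmetric matrix
    accessed only through the matvec H_mv, using k Lanczos steps.

    This is a sketch with full reorthogonalization for numerical
    stability; production Lanczos codes are considerably more careful.
    """
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    Q = [q]          # orthonormal Lanczos basis vectors
    alphas = []      # diagonal of the tridiagonal matrix T
    betas = []       # off-diagonal of T
    for _ in range(min(k, n)):
        w = H_mv(Q[-1])
        a = Q[-1] @ w
        alphas.append(a)
        w = w - a * Q[-1]
        if betas:
            w = w - betas[-1] * Q[-2]
        # full reorthogonalization against all previous basis vectors
        for qv in Q:
            w = w - (qv @ w) * qv
        b = np.linalg.norm(w)
        if b < 1e-10:  # invariant subspace found; stop early
            break
        betas.append(b)
        Q.append(w / b)
    # smallest Ritz value: min eigenvalue of the small tridiagonal T
    m = len(alphas)
    T = np.diag(alphas)
    if betas:
        T = T + np.diag(betas[:m - 1], 1) + np.diag(betas[:m - 1], -1)
    return np.linalg.eigvalsh(T)[0]
```

The smallest Ritz value of the tridiagonal matrix converges rapidly to the extreme eigenvalue of the Hessian, which is what makes an ϵ-approximate minimum eigenvalue cheap relative to a full eigendecomposition.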
| Original language | English |
| --- | --- |
| Pages (from-to) | 471-506 |
| Number of pages | 36 |
| Journal | Computational Optimization and Applications |
| Volume | 79 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Jun 2021 |
Keywords
- Complexity analysis
- Constrained convex optimization
- Cubic regularization subproblem
- First-order methods
ASJC Scopus subject areas
- Control and Optimization
- Computational Mathematics
- Applied Mathematics