Abstract
In this paper, we investigate an inexact quasisubgradient method with extrapolation for solving a quasiconvex optimization problem over a closed, convex and bounded constraint set. We establish convergence in objective values, iteration complexity and rate of convergence for the proposed method under a Hölder condition and a weak sharp minima condition. When both the diminishing stepsize and the extrapolation stepsize decay as power functions, we obtain explicit iteration complexities. When the diminishing stepsize decays as a power function and the extrapolation stepsize is decreasing but not less than a power function, the diminishing stepsize yields a rate of convergence O(τ^{k^s}) (s ∈ (0,1)) to an optimal solution or to a ball around the optimal solution set, which is faster than O(1/k^β) for every β > 0. With a geometrically decreasing extrapolation stepsize, we obtain a linear rate of convergence to a ball around the optimal solution set for both the constant stepsize and the dynamic stepsize. Our numerical tests show that the method with extrapolation is considerably more efficient than the method without extrapolation in terms of the number of iterations needed to reach an approximate optimal solution.
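The abstract does not spell out the iteration itself, so the following is a minimal Python sketch, for illustration only, of a projected quasisubgradient step combined with heavy-ball-style extrapolation, using a power-function diminishing stepsize and a geometrically decreasing extrapolation stepsize of the kinds discussed above. The constraint set (a unit ball), the toy objective, the step schedules and the helper names (`project_ball`, `quasisubgradient_extrapolation`) are assumptions of this sketch, not the paper's method.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto a closed ball, a simple instance of a
    closed, convex and bounded constraint set."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def quasisubgradient_extrapolation(quasi_subgrad, x0, num_iters=500,
                                   step=lambda k: 1.0 / (k + 1) ** 0.75,   # power-function diminishing stepsize
                                   extrap=lambda k: 0.9 ** k):             # geometrically decreasing extrapolation stepsize
    """Illustrative sketch: extrapolate along the previous displacement,
    take a normalized quasisubgradient step, project back onto the set."""
    x_prev = x = np.asarray(x0, dtype=float)
    for k in range(num_iters):
        y = x + extrap(k) * (x - x_prev)        # extrapolation step
        g = quasi_subgrad(y)                    # (possibly inexact) quasisubgradient at y
        g = g / max(np.linalg.norm(g), 1e-12)   # only the direction of g matters
        x_prev, x = x, project_ball(y - step(k) * g)
    return x

# Toy quasiconvex objective: f(x) = sqrt(||x||) is a nondecreasing transform
# of a convex function, hence quasiconvex; the unit normal of its sublevel
# ball serves as a quasisubgradient away from the origin.
qs = lambda x: x / max(np.linalg.norm(x), 1e-12)
print(quasisubgradient_extrapolation(qs, x0=[0.8, -0.6]))  # iterates approach the minimizer 0
```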
Original language | English |
---|---|
Pages (from-to) | 676-703 |
Number of pages | 28 |
Journal | Journal of Optimization Theory and Applications |
Volume | 193 |
Issue number | 1-3 |
DOIs | |
Publication status | Published - Mar 2022 |
Keywords
- Extrapolation
- Iteration complexity
- Quasiconvex
- Quasisubgradient
- Rate of convergence
ASJC Scopus subject areas
- Management Science and Operations Research
- Control and Optimization
- Applied Mathematics