Abstract
An interior quasi-subgradient method based on a proximal distance is proposed to solve constrained nondifferentiable quasi-convex optimization problems in Hilbert spaces. It is shown that a newly introduced generalized Gâteaux subdifferential is a subset of a quasi-subdifferential. Convergence properties, including global convergence and iteration complexity, are investigated under a Hölder condition of order p when constant, diminishing, or dynamic stepsize rules are used. Convergence rate results are obtained by assuming a Hölder-type weak sharp minimum condition relative to an induced proximal distance.
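For intuition only, the sketch below shows an interior subgradient-type iteration of the general kind described in the abstract, specialized to the positive orthant with the entropy-like proximal distance D(x, y) = Σᵢ (xᵢ log(xᵢ/yᵢ) − xᵢ + yᵢ) and a diminishing stepsize. The objective f(x) = ‖x − a‖, the choice of proximal distance, and all function names are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Illustrative sketch (not the paper's exact scheme): an interior
# subgradient-type iteration on the positive orthant with the
# entropy-like proximal distance D(x, y) = sum(x*log(x/y) - x + y).
# For this choice the proximal step has the closed-form multiplicative
# update x_{k+1} = x_k * exp(-v_k * g_k), which keeps every iterate
# strictly inside the constraint set.

def quasi_subgradient(x, a):
    """Normalized (quasi-)subgradient of f(x) = ||x - a||.

    For this convex (hence quasi-convex) example, a unit vector normal
    to the sublevel set through x is the usual normalized subgradient.
    """
    d = x - a
    nrm = np.linalg.norm(d)
    return d / nrm if nrm > 0 else np.zeros_like(d)

def interior_subgradient_method(a, x0, c=0.5, iters=200):
    """Run the multiplicative (entropic) subgradient iteration.

    Uses a diminishing stepsize v_k = c / sqrt(k + 1); the constant or
    dynamic stepsize rules mentioned in the abstract would replace
    this one line.
    """
    x = np.asarray(x0, dtype=float)
    best = x.copy()
    for k in range(iters):
        g = quasi_subgradient(x, a)
        v = c / np.sqrt(k + 1.0)       # diminishing stepsize
        x = x * np.exp(-v * g)         # entropic proximal step; x stays > 0
        if np.linalg.norm(x - a) < np.linalg.norm(best - a):
            best = x.copy()
    return best

if __name__ == "__main__":
    a = np.array([1.0, 2.0, 0.5])      # minimizer lies in the open orthant
    x0 = np.array([3.0, 0.1, 4.0])     # strictly positive starting point
    x_best = interior_subgradient_method(a, x0)
    print("best iterate:", x_best, " f:", np.linalg.norm(x_best - a))
```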
| Original language | English |
| --- | --- |
| Pages (from-to) | 249-271 |
| Number of pages | 23 |
| Journal | Journal of Global Optimization |
| Volume | 83 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Jun 2022 |
Keywords
- Convergence analysis
- Interior subgradient method
- Proximal distance
- Quasi-convex optimization
ASJC Scopus subject areas
- Computer Science Applications
- Control and Optimization
- Management Science and Operations Research
- Applied Mathematics