Abstract
In this paper, we discuss the convergence properties of a class of descent algorithms for minimizing a continuously differentiable function f on ℝⁿ without assuming that the sequence {xₖ} of iterates is bounded. Under mild conditions, we prove that the limit inferior of ∥∇f(xₖ)∥ is zero and that false convergence does not occur when f is convex. Furthermore, we discuss the convergence rate of {∥xₖ∥} and {f(xₖ)} when {xₖ} is unbounded and {f(xₖ)} is bounded.
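To make the setting concrete, here is a minimal sketch of one member of the class of descent methods the abstract refers to: steepest descent with an Armijo backtracking line search, stopping when ∥∇f(xₖ)∥ falls below a tolerance. The function name `descent` and all parameter values are illustrative assumptions, not the paper's specific algorithm or conditions.

```python
import numpy as np

def descent(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
            tol=1e-8, max_iter=10_000):
    """Generic descent iteration x_{k+1} = x_k + t_k d_k with d_k = -grad f(x_k)
    and an Armijo backtracking line search (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:  # gradient norm driven toward zero
            break
        d = -g                        # steepest-descent direction
        t = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Example: minimize a convex quadratic f(x) = 0.5 * x^T A x on R^2
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = descent(f, grad, x0=[5.0, -3.0])
print(x_star, np.linalg.norm(grad(x_star)))  # iterate with near-zero gradient
```

For a convex quadratic like this, the iterates stay bounded and converge to the minimizer; the paper's interest is precisely the general case where boundedness of {xₖ} is not assumed.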
| Original language | English |
| --- | --- |
| Pages (from-to) | 177-188 |
| Number of pages | 12 |
| Journal | Journal of Optimization Theory and Applications |
| Volume | 95 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 1 Jan 1997 |
| Externally published | Yes |
Keywords
- Descent methods
- Global convergence
- Rate of convergence
- Unconstrained differentiable minimization
ASJC Scopus subject areas
- Control and Optimization
- Management Science and Operations Research
- Applied Mathematics