Abstract
Two fundamental convergence theorems are established for nonlinear conjugate gradient methods under only the descent condition. As a result, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular, when the parameter attains its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results hold for the Hestenes-Stiefel algorithm.
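To illustrate the methods the abstract refers to, the following is a minimal sketch of a generic nonlinear conjugate gradient iteration with the three classical choices of the conjugate parameter β (Fletcher-Reeves, Polak-Ribière, Hestenes-Stiefel). It is not the paper's analysis, only a standard textbook-style implementation; the backtracking line search, the restart safeguard, and the test problem are illustrative assumptions.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=500):
    """Nonlinear CG with a backtracking (Armijo) line search.

    beta_rule selects the conjugate parameter:
      "FR" Fletcher-Reeves, "PR" Polak-Ribiere, "HS" Hestenes-Stiefel.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking; terminates because d is a descent direction.
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        if beta_rule == "FR":
            beta = g_new.dot(g_new) / g.dot(g)
        elif beta_rule == "PR":
            beta = g_new.dot(y) / g.dot(g)
        else:  # "HS"
            beta = g_new.dot(y) / d.dot(y)
        d = -g_new + beta * d
        # Safeguard (an assumption of this sketch): restart with steepest
        # descent whenever the new direction fails the descent condition.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Illustrative check on a convex quadratic f(x) = x'Ax/2 - b'x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2), beta_rule="PR")
```

The descent condition mentioned in the abstract corresponds to the requirement g(x_k)·d_k < 0 at every iterate, which the safeguard above enforces explicitly.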
Original language | English |
---|---|
Pages (from-to) | 38-46 |
Number of pages | 9 |
Journal | Acta Mathematicae Applicatae Sinica |
Volume | 17 |
Issue number | 1 |
Publication status | Published - 1 Dec 2001 |
Externally published | Yes |
Keywords
- Conjugate gradient method
- Descent condition
- Global convergence
ASJC Scopus subject areas
- Applied Mathematics