Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications

J. Han, G. Liu, Defeng Sun, H. Yin

Research output: Journal article publication › Journal article › Academic research › peer-review

9 Citations (Scopus)


Two fundamental convergence theorems are established for nonlinear conjugate gradient methods under only the descent condition. As a consequence, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for a parameter at its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results also hold for the Hestenes-Stiefel algorithm.
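To make the abstract concrete, the following is a minimal illustrative sketch of a nonlinear conjugate gradient method showing the two conjugate-parameter choices named above, Fletcher-Reeves (FR) and Polak-Ribière (PR), together with a restart to steepest descent whenever the descent condition fails. The function name `nonlinear_cg`, the Armijo backtracking line search, and all numerical constants are assumptions for illustration, not details from the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=200):
    """Illustrative nonlinear CG sketch (not the paper's exact scheme).

    Conjugate parameter:
      FR (Fletcher-Reeves): beta = ||g_new||^2 / ||g||^2
      PR (Polak-Ribiere):   beta = g_new.(g_new - g) / ||g||^2
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search for a sufficient decrease.
        t, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
            if t < 1e-16:
                break
        x_new = x + t * d
        g_new = grad(x_new)
        if beta_rule == "FR":
            beta = g_new.dot(g_new) / g.dot(g)
        else:  # PR
            beta = g_new.dot(g_new - g) / g.dot(g)
        d = -g_new + beta * d
        # Descent condition: restart with steepest descent if it fails.
        if d.dot(g_new) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

The restart step is the key connection to the abstract: the convergence theorems there require only that the search direction remain a descent direction, which this sketch enforces explicitly.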
Original language: English
Pages (from-to): 38-46
Number of pages: 9
Journal: Acta Mathematicae Applicatae Sinica
Issue number: 1
Publication status: Published - 1 Dec 2001
Externally published: Yes


Keywords

  • Conjugate gradient method
  • Descent condition
  • Global convergence

ASJC Scopus subject areas

  • Applied Mathematics
