New quasi-Newton methods for unconstrained optimization problems

Zengxin Wei, Guoyin Li, Liqun Qi

Research output: Journal article, peer-reviewed

175 Citations (Scopus)


Many methods for solving minimization problems are variants of Newton's method, which requires the specification of the Hessian matrix of second derivatives. Quasi-Newton methods are intended for the situation where the Hessian is expensive or difficult to calculate. Quasi-Newton methods use only first derivatives to build an approximate Hessian over a number of iterations; this approximation is updated at each iteration by a matrix of low rank. In unconstrained minimization, the original quasi-Newton equation is B_{k+1} s_k = y_k, where y_k is the difference of the gradients at the last two iterates. In this paper, we first propose a new quasi-Newton equation B_{k+1} s_k = y_k^*, in which y_k^* is given by the sum of y_k and A_k s_k, where A_k is some matrix. We then give two choices of A_k that carry some second-order information from the Hessian of the objective function. The three corresponding BFGS-type algorithms are proved to possess the global convergence property, and the superlinear convergence of one of these algorithms is proved. Extensive numerical experiments have been conducted, showing that the proposed algorithms are very encouraging.
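The BFGS-type scheme described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact method: the paper's two specific choices of A_k are not given in the abstract, so `make_A` is left as a user-supplied hook (with `make_A=None` reducing to the classical BFGS update, i.e. A_k = 0). The backtracking line search and the curvature safeguard are standard additions assumed here to keep the sketch runnable.

```python
import numpy as np

def bfgs_modified(f, grad, x0, make_A=None, tol=1e-8, max_iter=200):
    """BFGS-type quasi-Newton loop using a modified secant equation
    B_{k+1} s_k = y_k^* with y_k^* = y_k + A_k s_k.

    make_A(x, x_new, s, y) should return the matrix A_k; passing None
    uses A_k = 0, which recovers the classical BFGS update.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.eye(n)                       # initial Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -np.linalg.solve(B, g)      # quasi-Newton search direction
        # simple backtracking (Armijo) line search
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        A = np.zeros((n, n)) if make_A is None else make_A(x, x_new, s, y)
        y_star = y + A @ s              # modified gradient difference y_k^*
        if s @ y_star > 1e-12:          # curvature safeguard: keep B positive definite
            Bs = B @ s
            B = (B - np.outer(Bs, Bs) / (s @ Bs)
                   + np.outer(y_star, y_star) / (s @ y_star))
        x, g = x_new, g_new
    return x
```

For example, on the convex quadratic f(x) = (x_1^2 + 10 x_2^2)/2 with the default A_k = 0, the iterates converge to the minimizer at the origin within a handful of steps.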
Original language: English
Pages (from-to): 1156-1188
Number of pages: 33
Journal: Applied Mathematics and Computation
Issue number: 2
Publication status: Published - 15 Apr 2006


Keywords

  • Global convergence
  • Quasi-Newton equation
  • Quasi-Newton method
  • Superlinear convergence
  • Unconstrained optimization

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics


