A new recurrent neural network with noise-tolerance and finite-time convergence for dynamic quadratic minimization

Lin Xiao, Shuai Li, Jian Yang, Zhijun Zhang

Research output: Journal article › Academic research › peer-review

35 Citations (Scopus)


To solve dynamic quadratic minimization, a nonlinearly activated integration design formula is first proposed in this paper with additive noises taken into account. On the basis of this design formula, a new recurrent neural network (RNN) is then established to solve dynamic quadratic minimization. Compared with the conventional Zhang neural network (ZNN) for this problem, the proposed RNN model possesses finite-time convergence and inherent noise tolerance, and is thus called the versatile RNN (VRNN) model. In addition, the global stability, finite-time convergence, and denoising ability of the VRNN model are proved by rigorous mathematical analysis. The upper bound of the finite convergence time of the VRNN model is also derived analytically. Numerical simulation results are presented to validate the efficacy of the VRNN model, as well as its superior performance over the conventional ZNN model for dynamic quadratic minimization in the presence of various additive noises.
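The abstract describes an integration-enhanced, nonlinearly activated design formula for tracking the time-varying minimizer of a dynamic quadratic cost under additive noise. The sketch below illustrates the general idea under stated assumptions: the quadratic cost f(x, t) = ½xᵀA(t)x + b(t)ᵀx, the power-sum activation, the gains `gamma1`/`gamma2`, the constant-noise setting, and the finite-difference derivatives are all illustrative choices, not the paper's exact model or parameters.

```python
import numpy as np

def phi(e, r=0.5):
    # Sign-preserving nonlinear (power-sum) activation; an assumed choice,
    # commonly used in finite-time ZNN-type designs.
    return np.sign(e) * (np.abs(e) ** r + np.abs(e) ** (1.0 / r))

def A(t):
    # Time-varying positive-definite Hessian of the quadratic cost (assumed).
    return np.array([[2.0 + np.sin(t), 0.0],
                     [0.0, 2.0 + np.cos(t)]])

def b(t):
    # Time-varying linear coefficient vector (assumed).
    return np.array([np.sin(t), np.cos(t)])

def simulate(T=10.0, dt=1e-3, gamma1=5.0, gamma2=5.0, noise=0.5):
    """Euler simulation of an integration-enhanced error dynamics
    e_dot = -gamma1*phi(e) - gamma2*Integral(phi(e)) + noise,
    where e(t) = A(t)x + b(t) is the gradient of the quadratic cost."""
    x = np.zeros(2)      # network state
    acc = np.zeros(2)    # running integral of the activated error
    h = 1e-6             # step for finite-difference time derivatives
    for k in range(int(T / dt)):
        t = k * dt
        e = A(t) @ x + b(t)
        acc += phi(e) * dt
        # Integral term compensates the constant additive noise at steady state.
        e_dot = -gamma1 * phi(e) - gamma2 * acc + noise
        # Recover x_dot from e = A x + b:  e_dot = A_dot x + A x_dot + b_dot.
        A_dot = (A(t + h) - A(t)) / h
        b_dot = (b(t + h) - b(t)) / h
        x_dot = np.linalg.solve(A(t), e_dot - A_dot @ x - b_dot)
        x += x_dot * dt
    # Residual gradient norm at the final time: small if tracking succeeded.
    return np.linalg.norm(A(T) @ x + b(T))

residual = simulate()
```

Without the integral term (`gamma2 = 0`), a constant additive noise leaves a nonzero steady-state residual; the accumulated term absorbs it, which is the intuition behind the noise-tolerant behavior the abstract attributes to the VRNN over the conventional ZNN.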

Original language: English
Pages (from-to): 125-132
Number of pages: 8
Publication status: Published - 12 Apr 2018


Keywords

  • Additive noises
  • Dynamic quadratic minimization
  • Finite-time convergence
  • Global stability
  • Recurrent neural network
  • Zhang neural network (ZNN)

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
