Fast Convergence for Back-Propagation Network with Magnified Gradient Function

S. C. Ng, C. C. Cheung, S. H. Leung, A. Luk

Research output: Unpublished conference presentation (presented paper, abstract, poster) › Conference presentation (not published in journal/proceeding/book) › Academic research › peer-review

33 Citations (Scopus)

Abstract

This paper presents a modified back-propagation algorithm using magnified gradient function (MGFPROP), which can effectively speed up the convergence rate and improve the global convergence capability of back-propagation. The purpose of MGFPROP is to increase the convergence rate by magnifying the gradient function of the activation function. From the convergence analysis, it is shown that the new algorithm retains the gradient descent property but gives faster convergence than that of the back-propagation algorithm. Simulation results show that, in terms of the convergence rate and the percentage of global convergence, the new algorithm always outperforms the standard back-propagation algorithm and other competing techniques.
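As a rough illustration of the idea described in the abstract, the sketch below trains a small network with a magnified-gradient variant of back-propagation. It assumes the magnification takes the form of raising the sigmoid-derivative factor o(1-o) to a power 1/S with S ≥ 1 (S = 1 recovering standard back-propagation); the network size, learning rate, and XOR task are illustrative choices, not details taken from the paper.

```python
# Minimal sketch of back-propagation with a magnified gradient term.
# Assumption: magnification raises the sigmoid-derivative factor o*(1-o)
# to the power 1/S (S >= 1); S = 1 reduces to standard back-propagation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def magnified_deriv(o, S):
    # Standard derivative is o*(1-o); raising it to 1/S magnifies it
    # whenever o*(1-o) < 1, which counteracts the flat-spot problem.
    return (o * (1.0 - o)) ** (1.0 / S)

def train_xor(S=4.0, lr=0.5, epochs=5000, hidden=4):
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(scale=0.5, size=(2, hidden))   # input -> hidden weights
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))   # hidden -> output weights
    b2 = np.zeros(1)

    for _ in range(epochs):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        o = sigmoid(h @ W2 + b2)

        # Backward pass with magnified gradient factors in the error signals
        delta_o = (T - o) * magnified_deriv(o, S)
        delta_h = (delta_o @ W2.T) * magnified_deriv(h, S)

        # Weight updates (gradient descent on the squared error)
        W2 += lr * h.T @ delta_o
        b2 += lr * delta_o.sum(axis=0)
        W1 += lr * X.T @ delta_h
        b1 += lr * delta_h.sum(axis=0)

    return o

if __name__ == "__main__":
    print(np.round(train_xor(), 3))   # outputs should approach [[0], [1], [1], [0]]
```

With S > 1 the error signal stays usable even when a unit saturates (o near 0 or 1), which is the mechanism behind the faster convergence claimed above; setting S = 1 in this sketch gives plain back-propagation for comparison.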

Original language: English
Pages: 1903-1908
Number of pages: 6
Publication status: Published - 20 Jul 2003
Event: International Joint Conference on Neural Networks 2003 - Portland, OR, United States
Duration: 20 Jul 2003 - 24 Jul 2003

Conference

Conference: International Joint Conference on Neural Networks 2003
Country/Territory: United States
City: Portland, OR
Period: 20/07/03 - 24/07/03

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
