A new adaptive learning algorithm using magnified gradient function

S. C. Ng, C. C. Cheung, S. H. Leung, A. Luk

Research output: Unpublished conference presentation (presented paper) › Conference presentation (not published in journal/proceeding/book) › Academic research › peer-review

3 Citations (Scopus)

Abstract

In this paper, a new algorithm is proposed to solve the "flat spot" problem in back-propagation networks by magnifying the gradient function. The idea of the new learning algorithm is to vary the gradient of the activation function so as to magnify the backward-propagated error signal, especially when the output approaches a wrong value; the convergence rate can thus be accelerated and the flat-spot problem eliminated. Simulation results show that, in terms of convergence rate and global search capability, the new algorithm consistently outperforms the traditional methods compared against.
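
The abstract does not spell out the exact magnification rule, so the NumPy sketch below is only a rough, non-authoritative illustration of the idea. It assumes one plausible form of magnification: raising the sigmoid derivative y(1-y) to the power 1/S with S >= 1, so the gradient term decays more slowly as an output saturates toward a wrong value. The network size (2-3-1), the learning rate, and the value of S are all hypothetical choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR task, a classic benchmark where the flat-spot problem shows up.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Small 2-3-1 network with random initial weights.
W1 = rng.uniform(-0.5, 0.5, (2, 3)); b1 = np.zeros(3)
W2 = rng.uniform(-0.5, 0.5, (3, 1)); b2 = np.zeros(1)

lr = 0.5   # learning rate (illustrative value)
S = 2.0    # magnification factor S >= 1 (assumed form of the MGF idea)

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Standard back-propagation uses the derivative y(1-y), which vanishes
    # when y is near 0 or 1 (the flat spot). The assumed magnification
    # raises it to the power 1/S so the error signal stays larger there.
    g_out = (y * (1 - y)) ** (1.0 / S)
    delta_out = (T - y) * g_out

    g_hid = (h * (1 - h)) ** (1.0 / S)
    delta_hid = (delta_out @ W2.T) * g_hid

    # Delta-rule weight updates (gradient step on the squared error).
    W2 += lr * h.T @ delta_out; b2 += lr * delta_out.sum(axis=0)
    W1 += lr * X.T @ delta_hid; b1 += lr * delta_hid.sum(axis=0)

print(f"final MSE: {float(np.mean((T - y) ** 2)):.4f}")
```

Setting S = 1 in this sketch recovers plain back-propagation, which makes it easy to compare convergence behavior with and without the assumed magnification.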

Original language: English
Pages: 156-159
Number of pages: 4
Publication status: Published - 15 Jul 2001
Externally published: Yes
Event: International Joint Conference on Neural Networks (IJCNN'01) - Washington, DC, United States
Duration: 15 Jul 2001 - 19 Jul 2001

Conference

Conference: International Joint Conference on Neural Networks (IJCNN'01)
Country/Territory: United States
City: Washington, DC
Period: 15/07/01 - 19/07/01

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
