A new fast learning algorithm with promising global convergence capability for feed-forward neural networks

Chi Chung Cheung, Sin Chun Ng, Andrew K. Lui, Sean Shensheng Xu

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

2 Citations (Scopus)


The backpropagation (BP) learning algorithm is the most widely used supervised learning technique for training multi-layer feed-forward neural networks. Although many modifications of BP have been proposed to speed up learning, they seldom address the local minimum and flat-spot problems. This paper proposes a new algorithm, the Local-minimum and Flat-spot Problem Solver (LFPS), to solve these two problems. It uses a systematic approach to check whether a learning process is trapped at a local minimum or in a flat-spot area, and then escapes from it. Thus, a learning process using LFPS can keep finding an appropriate path toward the global minimum. The performance investigation shows that the proposed algorithm converges in all the learning problems (applications) tested, whereas other popular fast learning algorithms sometimes exhibit very poor global convergence capability.
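The flat-spot problem mentioned in the abstract arises because the sigmoid derivative σ'(x) = σ(x)(1 − σ(x)) approaches zero when a neuron saturates, so BP weight updates vanish even while the output error remains large. The sketch below is only an illustration of that effect and of one possible stall-detection heuristic; it is not the paper's actual LFPS algorithm, and the tolerance names and values are assumptions:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_deriv(y):
    # Derivative expressed via the output y = sigmoid(x): y * (1 - y).
    return y * (1.0 - y)

def looks_stalled(error, grad_norm, err_tol=0.1, grad_tol=1e-4):
    # Hypothetical heuristic (not from the paper): the error is still
    # large but the gradient is nearly zero, suggesting a flat spot
    # or local minimum rather than genuine convergence.
    return error > err_tol and grad_norm < grad_tol

# A saturated neuron: a large net input drives sigmoid_deriv toward
# zero, so the BP update term (error * derivative) vanishes even
# though the output is far from the target.
target = 0.0
net = 10.0                        # strongly saturated net input
y = sigmoid(net)                  # close to 1.0
error = abs(y - target)           # large output error
grad = error * sigmoid_deriv(y)   # tiny gradient: flat spot

print(looks_stalled(error, grad))  # True: learning has stalled
```

A solver in the spirit of LFPS would run such a check periodically during training and, on detecting a stall, take corrective action (the paper's specific escape mechanism is not described in this abstract).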

Original language: English
Title of host publication: 2013 International Joint Conference on Neural Networks, IJCNN 2013
Publication status: Published - 4 Aug 2013
Event: 2013 International Joint Conference on Neural Networks, IJCNN 2013 - Dallas, TX, United States
Duration: 4 Aug 2013 – 9 Aug 2013

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks


Conference: 2013 International Joint Conference on Neural Networks, IJCNN 2013
Country/Territory: United States
City: Dallas, TX

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
