Addressing the local minima problem by output monitoring and modification algorithms

Sin Chun Ng, Chi Chung Cheung, Andrew Kwok Fai Lui, Hau Ting Tse

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

3 Citations (Scopus)

Abstract

This paper proposes a new approach, output monitoring and modification (OMM), to address the local minimum problem of existing gradient-descent algorithms (such as BP, Rprop and Quickprop) in training feed-forward neural networks. OMM monitors the learning process; when the process is trapped in a local minimum, OMM changes some incorrect output values so that the process can escape from that minimum. The modification can be repeated with different parameter settings until the learning process converges to the global optimum. Simulation results show that a gradient-descent learning algorithm with OMM has a much better global convergence capability than one without, while their convergence rates are similar. In one benchmark application, the global convergence capability increased from 1% to 100%.
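The idea described in the abstract — monitor the learning curve, and when training stalls in an apparent local minimum, temporarily modify the targets of still-incorrect outputs to enlarge the error signal — can be sketched as follows. This is an illustrative sketch only: the stall criterion, the target-modification rule, and all names (`train_bp_omm`, `stall_window`, `stall_tol`) are assumptions, not the authors' exact OMM algorithm.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp_omm(X, T, hidden=3, lr=0.5, epochs=5000,
                 stall_window=200, stall_tol=1e-4, seed=0):
    """Plain back-propagation with an OMM-style monitor (illustrative sketch).

    When the mean squared error stops improving over a window (read here as a
    possible local minimum), the targets of incorrectly learned outputs are
    pushed beyond their correct side for that epoch, enlarging the error
    signal so gradient descent can move out of the flat region.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (X.shape[1], hidden))
    W2 = rng.normal(0.0, 1.0, (hidden, T.shape[1]))
    history = []
    for epoch in range(epochs):
        H = sigmoid(X @ W1)          # hidden activations
        Y = sigmoid(H @ W2)          # network outputs
        err = float(np.mean((T - Y) ** 2))
        history.append(err)
        # --- monitoring: has the learning curve stagnated above tolerance? ---
        if (len(history) > stall_window
                and history[-stall_window] - err < stall_tol
                and err > 1e-3):
            wrong = np.abs(T - Y) > 0.5           # incorrectly learned outputs
            # modification: overshoot the targets of wrong outputs only
            T_eff = np.where(wrong, np.where(T > 0.5, 1.2, -0.2), T)
        else:
            T_eff = T
        # --- standard BP weight update against the effective targets ---
        dY = (Y - T_eff) * Y * (1.0 - Y)
        dH = (dY @ W2.T) * H * (1.0 - H)
        W2 -= lr * H.T @ dY
        W1 -= lr * X.T @ dH
    return history
```

For example, on the XOR problem (`X` the four input patterns, `T` the targets), the returned error history lets one compare runs with the monitor enabled or disabled; the paper's benchmarks are more elaborate than this toy setup.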

Original language: English
Title of host publication: Advances in Neural Networks, ISNN 2012 - 9th International Symposium on Neural Networks, Proceedings
Pages: 206-216
Number of pages: 11
Edition: PART 1
DOIs
Publication status: Published - 11 Jul 2012
Event: 9th International Symposium on Neural Networks, ISNN 2012 - Shenyang, China
Duration: 11 Jul 2012 → 14 Jul 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 7367 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 9th International Symposium on Neural Networks, ISNN 2012
Country: China
City: Shenyang
Period: 11/07/12 → 14/07/12

Keywords

  • back-propagation
  • local minimum problem
  • Quickprop
  • Rprop

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
