A systematic algorithm to escape from local minima in training feed-forward neural networks

Chi Chung Cheung, Sean Shensheng Xu, Sin Chun Ng

Research output: Chapter in book / Conference proceeding, Conference article published in proceeding or book, Academic research, peer-reviewed

Abstract

A learning process can easily become trapped in a local minimum when training multi-layer feed-forward neural networks. An algorithm called Wrong Output Modification (WOM) was proposed to help a learning process escape from local minima, but WOM still cannot completely solve the local minimum problem. Moreover, no performance analysis has shown that learning has a higher probability of converging to a global solution when this algorithm is used. Additionally, the generalization performance of this algorithm was not investigated when early stopping is applied during training. Motivated by these limitations of WOM, we propose a new algorithm that ensures the learning process can escape from local minima, and we analyze its performance. We also evaluate the generalization performance of this new algorithm when early stopping is applied during training.
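The abstract evaluates generalization under the early stopping method of training. As background for that setup, the following is a minimal sketch of generic early stopping for a small feed-forward network trained by plain gradient descent; it is not the paper's WOM-based algorithm, and the synthetic data, network size, and `patience` threshold are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: generic early stopping on a tiny 1-hidden-layer
# network, NOT the paper's algorithm. All hyperparameters are assumptions.
rng = np.random.default_rng(0)

# Synthetic regression data, split into training and validation sets.
X = rng.normal(size=(200, 2))
y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(200, 1))
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

# One hidden layer with tanh activation.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

lr, patience = 0.05, 20
best_val, best_epoch, wait = np.inf, 0, 0

for epoch in range(1000):
    # Forward and backward pass for mean-squared error.
    h, out = forward(X_tr)
    grad_out = 2 * (out - y_tr) / len(X_tr)
    gW2 = h.T @ grad_out; gb2 = grad_out.sum(0)
    grad_h = grad_out @ W2.T * (1 - h ** 2)
    gW1 = X_tr.T @ grad_h; gb1 = grad_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

    # Early stopping: halt once validation loss has not improved
    # for `patience` consecutive epochs.
    val = mse(forward(X_va)[1], y_va)
    if val < best_val:
        best_val, best_epoch, wait = val, epoch, 0
    else:
        wait += 1
        if wait >= patience:
            break

print(f"stopped after epoch {epoch}; best val MSE {best_val:.4f} at epoch {best_epoch}")
```

In this scheme the network chosen for evaluation is the one with the best validation loss, which is the generalization criterion the abstract refers to when early stopping is applied.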

Original language: English
Title of host publication: 2016 International Joint Conference on Neural Networks, IJCNN 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 396-402
Number of pages: 7
ISBN (Electronic): 9781509006199
DOIs
Publication status: Published - 24 Jul 2016
Event: 2016 International Joint Conference on Neural Networks, IJCNN 2016 - Vancouver Convention Centre, Vancouver, Canada
Duration: 24 Jul 2016 - 29 Jul 2016

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2016-October

Conference

Conference: 2016 International Joint Conference on Neural Networks, IJCNN 2016
Country: Canada
City: Vancouver
Period: 24/07/16 - 29/07/16

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
