Deterministic weight modification algorithm for efficient learning

S. C. Ng, C. C. Cheung, S. H. Leung

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

This paper presents a new approach, deterministic weight modification (DWM), that effectively speeds up convergence and improves the global convergence capability of the standard and modified back-propagation (BP) algorithms. The main idea of DWM is to reduce the system error by changing the weights of a multi-layered feed-forward neural network in a deterministic way. Simulation results show that DWM outperforms standard BP and other modified BP algorithms on a number of learning problems.
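The abstract does not give the DWM update rule itself, so as background, here is a minimal sketch of the standard BP weight update that DWM modifies, for a single-hidden-layer feed-forward network with sigmoid units. All names, the network shape, and the learning rate are illustrative assumptions, not details from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bp_step(W1, W2, x, t, lr=0.5):
    """One standard BP step for a 1-hidden-layer net (column vectors).

    Returns updated weights and the system error before the update.
    DWM (per the abstract) would alter the weights deterministically
    to reduce this error; that rule is not shown here.
    """
    h = sigmoid(W1 @ x)               # hidden activations
    y = sigmoid(W2 @ h)               # output activations
    e = y - t                         # output error
    d2 = e * y * (1 - y)              # output-layer delta
    d1 = (W2.T @ d2) * h * (1 - h)    # hidden-layer delta
    W2 = W2 - lr * (d2 @ h.T)         # gradient-descent weight changes
    W1 = W1 - lr * (d1 @ x.T)
    return W1, W2, float(0.5 * np.sum(e ** 2))

# Illustrative training on a single pattern.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))
W2 = rng.normal(scale=0.5, size=(1, 3))
x = np.array([[1.0], [0.0]])
t = np.array([[1.0]])

first_err = None
for _ in range(50):
    W1, W2, err = bp_step(W1, W2, x, t)
    if first_err is None:
        first_err = err
```

Repeated gradient steps drive the system error down on this pattern; the paper's claim is that deterministic weight changes can reduce that error faster than gradient descent alone.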

Original language: English
Title of host publication: 2004 IEEE International Joint Conference on Neural Networks - Proceedings
Pages: 1033-1038
Number of pages: 6
DOIs
Publication status: Published - 25 Jul 2004
Externally published: Yes
Event: 2004 IEEE International Joint Conference on Neural Networks - Budapest, Hungary
Duration: 25 Jul 2004 → 29 Jul 2004

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
Volume: 2
ISSN (Print): 1098-7576

Conference

Conference: 2004 IEEE International Joint Conference on Neural Networks
Country: Hungary
City: Budapest
Period: 25/07/04 → 29/07/04

ASJC Scopus subject areas

  • Software
