Abstract
A new algorithm is proposed to solve the 'flat spot' problem in back-propagation neural networks by magnifying the gradient function. Simulation results show that, in terms of convergence rate and percentage of global convergence, the new algorithm consistently outperforms traditional methods.
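The letter's specific magnification function is not reproduced in this record, so the sketch below only illustrates the underlying idea: with a sigmoid unit, the derivative term of back-propagation vanishes when the output saturates near 0 or 1 (the flat spot), and enlarging that term keeps learning from stalling. The `magnified_prime` helper and its `power` parameter are assumptions chosen for illustration, not the published algorithm.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(y):
    # Standard derivative written in terms of the unit output y = sigmoid(x).
    # Near y = 0 or y = 1 this term is close to zero, so the back-propagated
    # error (and hence the weight update) almost disappears: the 'flat spot'.
    return y * (1.0 - y)

def magnified_prime(y, power=0.5):
    # Hypothetical magnification for illustration only: raising the derivative
    # to a power below 1 enlarges the near-zero values far more than the
    # mid-range ones, keeping a usable gradient in the saturated regions.
    return sigmoid_prime(y) ** power

y = np.array([0.01, 0.5, 0.99])   # two saturated outputs, one mid-range
print(sigmoid_prime(y))           # [0.0099 0.25   0.0099]
print(magnified_prime(y))         # [0.0995 0.5    0.0995]
```

In a back-propagation loop, the magnified term would replace `sigmoid_prime` when computing the error signal at each unit; the forward pass is unchanged.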
| Original language | English |
| --- | --- |
| Pages (from-to) | 42-43 |
| Number of pages | 2 |
| Journal | Electronics Letters |
| Volume | 37 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 4 Jan 2001 |
| Externally published | Yes |
ASJC Scopus subject areas
- Electrical and Electronic Engineering