Abstract
This paper presents MGFPROP, a modified back-propagation algorithm that uses a magnified gradient function to speed up convergence and improve the global convergence capability of back-propagation. MGFPROP increases the convergence rate by magnifying the gradient term derived from the activation function. Convergence analysis shows that the new algorithm retains the gradient-descent property while converging faster than standard back-propagation. Simulation results show that, in terms of both convergence rate and percentage of global convergence, the new algorithm consistently outperforms standard back-propagation and other competing techniques.
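The abstract does not spell out the magnification itself, but a common MGF-style formulation raises the sigmoid derivative o(1 - o) to a power 1/S with S ≥ 1, so the gradient term is enlarged precisely where the derivative is small (the sigmoid's flat regions) and S = 1 recovers standard back-propagation. The sketch below illustrates this assumed form on the XOR problem; the function names, the exponent 1/S, and all hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of back-propagation with a magnified gradient function.
# ASSUMPTION: the magnification raises the sigmoid derivative o*(1 - o)
# to the power 1/S (S >= 1); this is an illustrative MGF-style form,
# not necessarily the exact definition used in the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def magnified_deriv(o, S):
    # Standard derivative is o*(1 - o); magnifying with exponent 1/S
    # enlarges small gradients near o = 0 or o = 1. S = 1 is plain BP.
    eps = 1e-12  # guard against 0 ** fractional
    return (o * (1.0 - o) + eps) ** (1.0 / S)

def train_xor(S=2.0, lr=0.5, epochs=5000, hidden=4):
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(scale=0.5, size=(2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        # Backward pass: the usual delta terms, but with the
        # magnified derivative in place of o*(1 - o).
        delta_out = (y - T) * magnified_deriv(y, S)
        delta_hid = (delta_out @ W2.T) * magnified_deriv(h, S)
        W2 -= lr * h.T @ delta_out
        b2 -= lr * delta_out.sum(axis=0)
        W1 -= lr * X.T @ delta_hid
        b1 -= lr * delta_hid.sum(axis=0)
    return y

if __name__ == "__main__":
    # Outputs should approach [0, 1, 1, 0] for the XOR targets.
    print(np.round(train_xor(S=2.0), 3))
```

Under this assumed form, increasing S flattens the derivative factor toward 1, which counters the vanishing-gradient "flat spot" that slows standard back-propagation on saturated units.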
| Original language | English |
| --- | --- |
| Pages | 1903-1908 |
| Number of pages | 6 |
| Publication status | Published - 20 Jul 2003 |
| Event | International Joint Conference on Neural Networks 2003, Portland, OR, United States. Duration: 20 Jul 2003 → 24 Jul 2003 |
Conference
| Conference | International Joint Conference on Neural Networks 2003 |
| --- | --- |
| Country/Territory | United States |
| City | Portland, OR |
| Period | 20/07/03 → 24/07/03 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence