TY - GEN
T1 - The multi-phase method in fast learning algorithms
AU - Cheung, Chi Chung
AU - Ng, Sin Chun
PY - 2009/6/14
Y1 - 2009/6/14
N2 - The backpropagation (BP) learning algorithm is the most widely used supervised learning technique, extensively applied in the training of multi-layer feed-forward neural networks. Many modifications proposed to improve the performance of BP have focused on solving the "flat spot" problem to increase the convergence rate. However, their performance is limited by the error overshooting problem. In [20], a novel approach called BP with Two-Phase Magnified Gradient Function (2P-MGFPROP) was introduced to overcome the error overshooting problem and hence speed up the convergence rate of MGFPROP. In this paper, this approach is further enhanced by dividing the learning process into multiple phases, with different fast learning algorithms assigned to different phases to improve the convergence rate for different adaptive problems. The performance investigation shows that the convergence rate can be increased by up to a factor of two compared with existing fast learning algorithms.
AB - The backpropagation (BP) learning algorithm is the most widely used supervised learning technique, extensively applied in the training of multi-layer feed-forward neural networks. Many modifications proposed to improve the performance of BP have focused on solving the "flat spot" problem to increase the convergence rate. However, their performance is limited by the error overshooting problem. In [20], a novel approach called BP with Two-Phase Magnified Gradient Function (2P-MGFPROP) was introduced to overcome the error overshooting problem and hence speed up the convergence rate of MGFPROP. In this paper, this approach is further enhanced by dividing the learning process into multiple phases, with different fast learning algorithms assigned to different phases to improve the convergence rate for different adaptive problems. The performance investigation shows that the convergence rate can be increased by up to a factor of two compared with existing fast learning algorithms.
UR - http://www.scopus.com/inward/record.url?scp=70449429569&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2009.5178684
DO - 10.1109/IJCNN.2009.5178684
M3 - Conference article published in proceeding or book
AN - SCOPUS:70449429569
SN - 9781424435531
T3 - Proceedings of the International Joint Conference on Neural Networks
SP - 552
EP - 559
BT - 2009 International Joint Conference on Neural Networks, IJCNN 2009
T2 - 2009 International Joint Conference on Neural Networks, IJCNN 2009
Y2 - 14 June 2009 through 19 June 2009
ER -