TY - GEN
T1 - Neural network training using stochastic PSO
AU - Chen, Xin
AU - Li, Yangmin
PY - 2006/1/1
Y1 - 2006/1/1
N2 - Particle swarm optimization (PSO) is widely applied to training neural networks (NNs). Since in many applications the number of NN weights is large, when PSO algorithms are applied to NN training the dimension of the search space is so high that PSO often converges prematurely. In this paper an improved stochastic PSO (SPSO) is presented, in which a random velocity term is added to improve the particles' exploration ability. Since SPSO explores the solution space more thoroughly, it is able to find the global best solution with high probability. Hence SPSO is well suited to high-dimensional optimization problems, especially NN training.
AB - Particle swarm optimization (PSO) is widely applied to training neural networks (NNs). Since in many applications the number of NN weights is large, when PSO algorithms are applied to NN training the dimension of the search space is so high that PSO often converges prematurely. In this paper an improved stochastic PSO (SPSO) is presented, in which a random velocity term is added to improve the particles' exploration ability. Since SPSO explores the solution space more thoroughly, it is able to find the global best solution with high probability. Hence SPSO is well suited to high-dimensional optimization problems, especially NN training.
UR - http://www.scopus.com/inward/record.url?scp=33750736050&partnerID=8YFLogxK
M3 - Conference article published in proceeding or book
SN - 3540464816
SN - 9783540464815
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 1051
EP - 1060
BT - Neural Information Processing - 13th International Conference, ICONIP 2006, Proceedings
PB - Springer Verlag
T2 - 13th International Conference on Neural Information Processing, ICONIP 2006
Y2 - 3 October 2006 through 6 October 2006
ER -