TY - GEN
T1 - A new stochastic PSO technique for neural network training
AU - Li, Yangmin
AU - Chen, Xin
PY - 2006/1/1
Y1 - 2006/1/1
N2 - Recently, Particle Swarm Optimization (PSO) has been widely applied to training neural networks (NNs). To improve the performance of PSO in the high-dimensional solution spaces that typically arise in NN training, this paper introduces a new paradigm of particle swarm optimization named stochastic PSO (S-PSO). The distinguishing feature of S-PSO is its strong exploration ability. Consequently, when the swarm size is relatively small, S-PSO performs much better than traditional PSO in NN training. Hence, if S-PSO is used to train a NN, the computational cost of training can be reduced significantly.
AB - Recently, Particle Swarm Optimization (PSO) has been widely applied to training neural networks (NNs). To improve the performance of PSO in the high-dimensional solution spaces that typically arise in NN training, this paper introduces a new paradigm of particle swarm optimization named stochastic PSO (S-PSO). The distinguishing feature of S-PSO is its strong exploration ability. Consequently, when the swarm size is relatively small, S-PSO performs much better than traditional PSO in NN training. Hence, if S-PSO is used to train a NN, the computational cost of training can be reduced significantly.
UR - http://www.scopus.com/inward/record.url?scp=33745913897&partnerID=8YFLogxK
M3 - Conference article published in proceedings or book
SN - 354034439X
SN - 9783540344391
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 564
EP - 569
BT - Advances in Neural Networks - ISNN 2006
PB - Springer Verlag
T2 - 3rd International Symposium on Neural Networks, ISNN 2006 - Advances in Neural Networks
Y2 - 28 May 2006 through 1 June 2006
ER -
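
Note: for readers unfamiliar with PSO-based training, the sketch below illustrates the general idea the abstract refers to: each particle encodes the full weight vector of a small network, and the swarm minimizes the training error. It implements plain global-best PSO on a toy XOR task; it is not the S-PSO variant proposed in the paper, and the network size, swarm size, and coefficients (w, c1, c2) are illustrative assumptions only.

# Illustrative sketch: generic global-best PSO training a tiny 2-2-1 network on XOR.
# Not the paper's S-PSO; particles are flattened weight vectors, fitness is training MSE.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Shapes of W1, b1, W2, b2; all weights are packed into one position vector.
sizes = [(2, 2), (1, 2), (2, 1), (1, 1)]
dim = int(sum(np.prod(s) for s in sizes))

def unpack(theta):
    parts, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts  # W1, b1, W2, b2

def mse(theta):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1.T + b1)                   # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    return float(np.mean((out - y) ** 2))

# Standard PSO hyperparameters (assumed, not taken from the paper).
n_particles, iters = 20, 500
w, c1, c2 = 0.7, 1.5, 1.5

pos = rng.uniform(-1, 1, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_f = np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("final training MSE:", mse(gbest))

For larger networks the search dimension grows quickly, and plain PSO of this kind needs a large swarm to explore well; reducing that cost with a small swarm is the advantage the abstract claims for S-PSO.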