Adding learning to cellular genetic algorithms for training recurrent neural networks

Kim Wing C. Ku, Man Wai Mak, Wan Chi Siu

Research output: Journal article publication › Journal article › Academic research › peer-review

31 Citations (Scopus)

Abstract

This paper proposes a hybrid optimization algorithm which combines the efforts of local search (individual learning) and cellular genetic algorithms (GA's) for training recurrent neural networks (RNN's). Each weight of an RNN is encoded as a floating point number, and a concatenation of the numbers forms a chromosome. Reproduction takes place locally in a square grid, with each grid point representing a chromosome. Two approaches for combining cellular GA's and learning, the Lamarckian and Baldwinian mechanisms, have been compared. Different hill-climbing algorithms are incorporated into the cellular GA's as learning methods. These include the real-time recurrent learning (RTRL) algorithm and its simplified versions, and the delta rule. The RTRL algorithm has been successively simplified by freezing some of the weights to form the simplified versions. The delta rule, which is the simplest form of learning, has been implemented by treating the RNN's as feedforward networks during learning. The hybrid algorithms are used to train the RNN's to solve a long-term dependency problem. The results show that Baldwinian learning is inefficient in assisting the cellular GA. It is conjectured that the more difficult it is for genetic operations to produce the genotypic changes that match the phenotypic changes due to learning, the poorer the convergence of Baldwinian learning. Most of the combinations using the Lamarckian mechanism show an improvement in reducing the number of generations required to find an optimum network; however, only a few can reduce the actual time taken. Embedding the delta rule in the cellular GA's has been found to be the fastest method. It is also concluded that learning should not be too extensive if the hybrid algorithm is to benefit from learning.
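The abstract describes the hybrid scheme only in prose. The following is a minimal, hypothetical Python sketch (not the authors' code) of a cellular GA with real-coded chromosomes, local reproduction on a square toroidal grid, and an embedded learning step, with a flag to switch between Lamarckian inheritance (learned weights written back into the genotype) and Baldwinian inheritance (learning affects fitness only). The fitness function, the learning routine, the grid size, and all parameter values are illustrative assumptions standing in for the RNN training task and the RTRL/delta-rule learners used in the paper.

```python
# Minimal sketch of a cellular GA with Lamarckian or Baldwinian learning.
# `evaluate` and `learn` are placeholders: in the paper they would be the
# RNN's task performance and a hill-climbing step (RTRL or the delta rule).
import numpy as np

GRID = 5           # chromosomes live on a GRID x GRID torus, one per grid point
N_WEIGHTS = 20     # length of the flat weight vector encoding the RNN
MUT_STD = 0.1      # std. dev. of Gaussian mutation
LAMARCKIAN = True  # True: write learned weights back into the genotype

rng = np.random.default_rng(0)

def evaluate(w):
    # Placeholder fitness: negative squared distance to an arbitrary target.
    return -np.sum((w - 1.0) ** 2)

def learn(w, steps=3, lr=0.05):
    # Placeholder hill-climbing: a few gradient-ascent steps on the
    # placeholder fitness, standing in for RTRL / the delta rule.
    w = w.copy()
    for _ in range(steps):
        w += lr * (-2.0 * (w - 1.0))
    return w

def neighbours(i, j):
    # Von Neumann neighbourhood on the toroidal grid (local reproduction).
    return [((i - 1) % GRID, j), ((i + 1) % GRID, j),
            (i, (j - 1) % GRID), (i, (j + 1) % GRID)]

# Initialise the population: one real-coded chromosome per grid point.
pop = rng.normal(size=(GRID, GRID, N_WEIGHTS))

for generation in range(50):
    fitness = np.empty((GRID, GRID))
    for i in range(GRID):
        for j in range(GRID):
            learned = learn(pop[i, j])
            # Fitness is always measured after learning; under Lamarckian
            # inheritance the learned weights also replace the genotype,
            # under Baldwinian inheritance they are discarded.
            fitness[i, j] = evaluate(learned)
            if LAMARCKIAN:
                pop[i, j] = learned

    new_pop = pop.copy()
    for i in range(GRID):
        for j in range(GRID):
            # Pick the fittest neighbour as the mate (local selection).
            mate = max(neighbours(i, j), key=lambda p: fitness[p])
            # Uniform crossover followed by Gaussian mutation.
            mask = rng.random(N_WEIGHTS) < 0.5
            child = np.where(mask, pop[i, j], pop[mate])
            child += rng.normal(scale=MUT_STD, size=N_WEIGHTS)
            # Replace the resident only if the child is at least as fit.
            if evaluate(learn(child)) >= fitness[i, j]:
                new_pop[i, j] = child
    pop = new_pop

best = max(((i, j) for i in range(GRID) for j in range(GRID)),
           key=lambda p: evaluate(pop[p]))
print("best fitness:", evaluate(pop[best]))
```

Toggling `LAMARCKIAN` to `False` in this sketch illustrates the Baldwinian variant the paper compares against: learning then shapes selection through the fitness values but never alters the chromosomes themselves.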
Original language: English
Pages (from-to): 239-252
Number of pages: 14
Journal: IEEE Transactions on Neural Networks
Volume: 10
Issue number: 2
DOIs
Publication status: Published - 1 Dec 1999

Keywords

  • Baldwin effect
  • Genetic algorithms
  • Lamarckian learning
  • Real-time recurrent learning
  • Recurrent neural networks

ASJC Scopus subject areas

  • Software
  • General Medicine
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
