Abstract
We study a feed-forward neural network trained on two independent function approximation tasks. After training, two modules form automatically in the hidden layers, each predominantly handling one of the tasks. We demonstrate that the sizes of the modules can be dynamically driven by varying the complexities of the tasks. The network serves as a simple example of an artificial neural network with an adaptable modular structure. This study was motivated by the related dynamical nature of modules in animal brains.
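As a rough illustration of the setup described in the abstract (not the authors' code), the sketch below trains a small feed-forward network on two independent scalar function approximation tasks and then groups hidden units into "modules" by which task's output weights dominate. The target functions `f1` and `f2`, the use of PyTorch, and all hyperparameters are assumptions made for the example only.

```python
# Minimal sketch, assuming PyTorch and illustrative target functions:
# a feed-forward net trained on two independent tasks, then a crude
# module assignment of hidden units based on outgoing weight magnitudes.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Two independent scalar tasks: output 1 depends only on input 1,
# output 2 only on input 2. Both functions are hypothetical examples.
def f1(x):
    return torch.sin(3.0 * x)            # assumed "simpler" task

def f2(x):
    return torch.sin(3.0 * x) * x ** 2   # assumed "more complex" task

n_hidden = 20
net = nn.Sequential(nn.Linear(2, n_hidden), nn.Tanh(), nn.Linear(n_hidden, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(5000):
    x = torch.rand(128, 2) * 2.0 - 1.0                   # inputs in [-1, 1]
    y = torch.stack([f1(x[:, 0]), f2(x[:, 1])], dim=1)   # independent targets
    opt.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()
    opt.step()

# Assign each hidden unit to the task toward which its outgoing weights
# are larger in magnitude; the split sizes give the "module" sizes.
w_out = net[2].weight.detach().abs()   # shape (2, n_hidden)
module = w_out.argmax(dim=0)           # 0 -> task-1 module, 1 -> task-2 module
print("units mainly serving task 1:", int((module == 0).sum()))
print("units mainly serving task 2:", int((module == 1).sum()))
```

Under this kind of setup, making one target function more complex would be expected to pull more hidden units into its module, which is the dynamical effect the abstract describes.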
Original language | English |
---|---|
Pages (from-to) | 3673-3677 |
Number of pages | 5 |
Journal | Physical Review E - Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics |
Volume | 58 |
Issue number | 3 |
Publication status | Published - 1 Jan 1998 |
ASJC Scopus subject areas
- Statistical and Nonlinear Physics
- Statistics and Probability
- Condensed Matter Physics