Abstract
This paper presents a novel, data-independent method for constructing a type of partially connected feedforward neural network (FNN). The proposed networks, called Apollonian network-based partially connected FNNs (APFNNs), are constructed from the structures of two-dimensional deterministic Apollonian networks. The APFNNs are then applied in various experiments to solve function approximation, forecasting and classification problems. Their results are compared with those generated by partially connected FNNs with random connectivity (RPFNNs), traditional FNNs trained with different learning algorithms, and other benchmark methods. The results demonstrate that the proposed APFNNs fit complicated input-output relations well and provide better generalization performance than traditional FNNs and RPFNNs. The APFNNs also train faster per epoch than traditional FNNs.
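The abstract does not specify how the Apollonian topology is mapped onto the layered FNN, so the sketch below is only illustrative. It builds the two-dimensional deterministic Apollonian graph by recursive face subdivision (the standard construction of Andrade et al.) and then, as one plausible mapping assumed here for illustration, uses the graph's adjacency as a sparsity mask on a layer's weight matrix. The function names (`apollonian_edges`, `connectivity_mask`) and the node-to-layer assignment are assumptions, not the authors' construction.

```python
import numpy as np

def apollonian_edges(generations):
    """2D deterministic Apollonian network: start from a triangle and,
    at every generation, insert a new node inside each triangular face
    and connect it to that face's three vertices."""
    edges = {(0, 1), (0, 2), (1, 2)}
    faces = [(0, 1, 2)]
    next_node = 3
    for _ in range(generations):
        new_faces = []
        for a, b, c in faces:
            d = next_node
            next_node += 1
            edges.update({(a, d), (b, d), (c, d)})
            new_faces += [(a, b, d), (a, c, d), (b, c, d)]
        faces = new_faces
    return edges, next_node          # edge set, total node count


def connectivity_mask(edges, in_nodes, out_nodes):
    """Binary mask for one layer: weight (i, j) is kept only if the
    corresponding Apollonian nodes are adjacent (illustrative mapping,
    not necessarily the paper's)."""
    mask = np.zeros((len(in_nodes), len(out_nodes)))
    for i, u in enumerate(in_nodes):
        for j, v in enumerate(out_nodes):
            if (min(u, v), max(u, v)) in edges:
                mask[i, j] = 1.0
    return mask


# Example: generation-2 network (7 nodes); the first 3 nodes feed
# the next 4 as a partially connected input-to-hidden layer.
edges, n = apollonian_edges(2)
mask = connectivity_mask(edges, in_nodes=range(3), out_nodes=range(3, 7))

rng = np.random.default_rng(0)
W = rng.standard_normal(mask.shape) * mask      # prune absent connections
x = rng.standard_normal((1, 3))
hidden = np.tanh(x @ W)                         # forward pass of the sparse layer
```

Because the topology is deterministic, the same mask is obtained for every run, in contrast to the RPFNNs mentioned above, where the retained connections would be drawn at random.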
| Original language | English |
| --- | --- |
| Pages (from-to) | 5298-5307 |
| Number of pages | 10 |
| Journal | Physica A: Statistical Mechanics and its Applications |
| Volume | 389 |
| Issue number | 22 |
| DOIs | |
| Publication status | Published - 15 Dec 2010 |
Keywords
- Apollonian networks
- Feedforward neural networks
- Partially connected neural networks
- Randomly connected neural networks
ASJC Scopus subject areas
- Condensed Matter Physics
- Statistics and Probability