Abstract
Multilayer feedforward networks (MFNs) are a class of artificial neural network widely used for pattern classification. Training is usually accomplished by the well-known backpropagation algorithm, a steepest-descent optimization that can become stuck in local minima and thus yield inferior performance. A further critical issue in applying MFNs is the need to predetermine an appropriate network size for the problem being solved. A network-growth approach is pursued to address both problems concurrently, and a progressive-training algorithm is proposed. The algorithm is guaranteed to converge to a finite-size network with a global-minimum solution for any real-to-real mapping task, without the network growing unreasonably large. Globally minimized networks with superior generalization performance are demonstrated on three representative data sets.
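The network-growth idea summarized in the abstract can be illustrated with a minimal, generic sketch: hidden units are added one at a time, each fitted by gradient descent to the current residual, until the training error falls below a tolerance. This is an illustrative assumption, not the paper's progressive-training algorithm; the toy task, unit form, and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy real-to-real mapping task: approximate y = sin(2x) on [-2, 2].
X = np.linspace(-2.0, 2.0, 80)
y = np.sin(2.0 * X)

def fit_unit(X, residual, steps=2000, lr=0.05):
    """Fit one tanh hidden unit (weights w, b, output weight v) to the
    current residual by plain gradient descent on squared error."""
    w, b, v = rng.normal(size=3) * 0.5
    for _ in range(steps):
        h = np.tanh(w * X + b)          # hidden activation
        err = v * h - residual          # prediction error on the residual
        dv = 2.0 * np.mean(err * h)     # gradients of mean squared error
        dh = 2.0 * err * v * (1.0 - h ** 2)
        dw = np.mean(dh * X)
        db = np.mean(dh)
        v -= lr * dv
        w -= lr * dw
        b -= lr * db
    return w, b, v

def grow_network(X, y, tol=1e-2, max_units=25):
    """Grow the hidden layer one unit at a time, each unit fitted to
    what the current network has not yet explained, until the
    mean-squared training error drops below tol (or a size cap)."""
    units = []
    residual = y.copy()
    while np.mean(residual ** 2) > tol and len(units) < max_units:
        w, b, v = fit_unit(X, residual)
        units.append((w, b, v))
        residual = residual - v * np.tanh(w * X + b)
    return units, np.mean(residual ** 2)

units, mse = grow_network(X, y)
print(f"hidden units grown: {len(units)}, final training MSE: {mse:.4f}")
```

The greedy residual fitting shown here only motivates the size-selection problem; the paper's contribution is a progressive-training scheme with a convergence guarantee to a global-minimum, finite-size network, which this sketch does not provide.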
| Original language | English |
|---|---|
| Pages (from-to) | 486-492 |
| Number of pages | 7 |
| Journal | IEE Proceedings: Control Theory and Applications |
| Volume | 142 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 1 Sept 1995 |
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering
- Instrumentation