Multilayer feedforward networks (MFNs) are a class of artificial neural network (ANN) models widely used for pattern classification. Training is usually accomplished by the well-known backpropagation algorithm, a steepest-descent optimization that can get stuck in local minima and thus yield inferior performance. A further critical issue in applying MFNs is the need to predetermine an appropriate network size for the problem being solved. To address both problems concurrently, a network-growth approach is pursued and a progressive-training algorithm is proposed. For any real-to-real mapping task, the algorithm is guaranteed to converge to a finite-size network with a global-minimum solution. The algorithm's effectiveness in growing reasonably sized, globally minimized networks with superior generalization performance is demonstrated on three representative data sets.
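The abstract does not give the algorithm's details, but the general network-growth idea it describes can be sketched as follows: a one-hidden-layer MFN is trained by steepest-descent backpropagation, and whenever training stalls above a target error, a hidden unit is added and training resumes. This is an illustrative sketch only, not the paper's actual progressive-training algorithm; the function name, architecture, and all hyperparameters are assumptions.

```python
import numpy as np

def train_growing_mfn(X, y, target_mse=1e-3, max_hidden=20,
                      epochs_per_stage=2000, lr=0.1, seed=0):
    """Illustrative network-growth training loop (hypothetical, not the
    paper's algorithm): grow a one-hidden-layer tanh network until the
    training MSE falls below target_mse or max_hidden units are reached."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    h = 1                                        # start with one hidden unit
    W1 = rng.normal(scale=0.5, size=(n_in, h))   # input-to-hidden weights
    b1 = np.zeros(h)
    W2 = rng.normal(scale=0.5, size=h)           # hidden-to-output weights
    b2 = 0.0
    while True:
        for _ in range(epochs_per_stage):
            z = np.tanh(X @ W1 + b1)             # hidden activations
            out = z @ W2 + b2                    # linear output unit
            err = out - y
            mse = np.mean(err ** 2)
            # backpropagation: steepest descent on the squared error
            g_out = 2 * err / len(y)
            W2 = W2 - lr * (z.T @ g_out)
            b2 = b2 - lr * g_out.sum()
            g_hid = np.outer(g_out, W2) * (1 - z ** 2)
            W1 = W1 - lr * (X.T @ g_hid)
            b1 = b1 - lr * g_hid.sum(axis=0)
        if mse <= target_mse or h >= max_hidden:
            return W1, b1, W2, b2, mse, h
        # grow the network: append one freshly initialized hidden unit
        h += 1
        W1 = np.hstack([W1, rng.normal(scale=0.5, size=(n_in, 1))])
        b1 = np.append(b1, 0.0)
        W2 = np.append(W2, rng.normal(scale=0.5))
```

For example, on the XOR mapping (which a single hidden unit cannot fit) the loop keeps adding units until the error target is met, so the returned network size reflects the problem's difficulty rather than a size fixed in advance.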
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering