A fast learning method for feedforward neural networks

Shitong Wang, Fu Lai Korris Chung, Jun Wang, Jun Wu

Research output: Journal article › Academic research › peer-review

16 Citations (Scopus)

Abstract

To circumvent the very slow convergence of most traditional learning algorithms for single-hidden-layer feedforward neural networks (SLFN), the extreme learning machine (ELM) was recently developed to achieve extremely fast learning with good performance by training only the output weights. However, ELM cannot be applied to multiple-hidden-layer feedforward neural networks (MLFN), which is a challenging bottleneck of ELM. In this work, a novel fast learning method (FLM) for feedforward neural networks is proposed. First, based on existing ridge regression theory, hidden-feature-space ridge regression (HFSR) and centered ridge regression (Centered-ELM) are presented, and their connection with ELM is theoretically revealed. As special kernel methods, they can inherently propagate the prominent advantages of ELM to MLFN. Then, FLM is proposed as a unified framework for HFSR and Centered-ELM. FLM can be applied to both SLFN and MLFN with single or multiple outputs. In FLM, only the parameters in the last hidden layer need to be adjusted, while all the parameters in the other hidden layers can be randomly assigned. The proposed FLM was tested against state-of-the-art methods on real-world datasets and provides better and more reliable results.
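The core ELM idea the abstract builds on, randomly assigning the hidden-layer weights and solving only the output weights in closed form by ridge regression, can be sketched as follows. This is a minimal illustrative implementation, not the paper's FLM code; all function and parameter names (`elm_train`, `n_hidden`, `ridge`) are our own.

```python
import numpy as np

def elm_train(X, T, n_hidden=50, ridge=1e-3, rng=None):
    """ELM-style training of a single-hidden-layer network:
    random hidden weights, output weights solved by ridge regression.
    (Illustrative sketch; not the authors' implementation.)"""
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                           # hidden-layer feature matrix
    # Closed-form output weights: beta = (H'H + ridge*I)^{-1} H'T
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Usage: fit a smooth 1-D regression target
X = np.linspace(-1, 1, 200).reshape(-1, 1)
T = np.sin(3 * X)
W, b, beta = elm_train(X, T, n_hidden=40, rng=0)
pred = elm_predict(X, W, b, beta)
print(float(np.mean((pred - T) ** 2)))  # training MSE
```

The FLM described in the abstract generalizes this recipe to multiple hidden layers: every hidden layer but the last is randomly assigned, and only the last layer's parameters are solved for.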
Original language: English
Pages (from-to): 295-307
Number of pages: 13
Journal: Neurocomputing
Volume: 149
Issue number: Part A
DOIs
Publication status: Published - 3 Feb 2015

Keywords

  • Extreme learning machine
  • Fast learning method
  • Feedforward neural network
  • Hidden-feature-space ridge regression

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
