Least learning machine and its experimental studies on regression capability

Shitong Wang, Fu Lai Korris Chung, Jun Wu, Jun Wang

Research output: Journal article publication › Journal article › Academic research › peer-review

16 Citations (Scopus)


Feedforward neural networks have been extensively used to approximate complex nonlinear mappings directly from input samples, but their traditional learning algorithms are usually much slower than required. In this work, two hidden-feature-space ridge regression methods, HFSR and centered-ELM, are first proposed for feedforward networks. As special kernel methods, both HFSR and centered-ELM share two important characteristics: the rigorous Mercer condition on kernel functions is not required, and they can inherently propagate the prominent advantages of the extreme learning machine (ELM) into multilayer feedforward networks (MLFNs). In addition to the randomly assigned weights adopted in both ELM and HFSR, HFSR exploits a further source of randomness, namely exemplars randomly selected from the training set for the kernel activation functions. Through forward layer-by-layer data transformation, HFSR and centered-ELM can be extended to MLFNs. Accordingly, the least learning machine (LLM) is proposed as a unified framework for HFSR and centered-ELM, applicable to both single-hidden-layer feedforward networks (SLFNs) and MLFNs with a single output or multiple outputs. LLM thus gives a new learning method for MLFNs that retains the virtues ELM offers only for SLFNs: only the parameters in the last hidden layer need to be adjusted, all parameters in the other hidden layers can be randomly assigned, and LLM trains MLFNs much faster than BP. The experimental results clearly demonstrate the power of LLM in nonlinear regression modeling.
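The core recipe the abstract attributes to ELM-style methods, including random, untrained hidden-layer weights with only the output (last-layer) parameters solved in closed form by ridge regression, can be sketched as follows. This is a minimal illustration of that general idea, not the paper's LLM or HFSR algorithm; all function and parameter names here are illustrative.

```python
import numpy as np

def elm_style_fit(X, y, n_hidden=50, ridge=1e-3, seed=0):
    """Fit an SLFN the ELM way: random hidden weights, ridge-solved output weights."""
    rng = np.random.default_rng(seed)
    # Hidden-layer weights and biases are randomly assigned and never trained.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden feature matrix
    # Only the output weights are adjusted, via the closed-form ridge solution:
    # beta = (H^T H + ridge * I)^{-1} H^T y
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_style_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Usage: a small nonlinear regression problem (noisy sine).
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
W, b, beta = elm_style_fit(X, y)
pred = elm_style_predict(X, W, b, beta)
mse = np.mean((pred - y) ** 2)
```

Because no iterative weight updates are involved, training cost is dominated by one linear solve, which is the source of the speed advantage over BP that the abstract describes.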
Original language: English
Pages (from-to): 677-684
Number of pages: 8
Journal: Applied Soft Computing Journal
Publication status: Published - 1 Jan 2014


Keywords
  • Extreme learning machine
  • Feedforward neural network
  • Hidden-feature-space ridge regression
  • Least learning machine

ASJC Scopus subject areas

  • Software
