Minimum description length neural networks for time series prediction

Michael Small, Chi Kong Tse

Research output: Journal article › Academic research › peer-review

75 Citations (Scopus)

Abstract

Artificial neural networks (ANN) are typically composed of a large number of nonlinear functions (neurons), each with several linear and nonlinear parameters that are fitted to data through a computationally intensive training process. Longer training results in a closer fit to the data, but excessive training leads to overfitting. We propose an alternative scheme that has previously been described for radial basis functions (RBF). We show that fundamental differences between ANN and RBF make the application of this scheme to ANN nontrivial. Under this scheme, the training process is replaced by an optimal fitting routine, and overfitting is avoided by controlling the number of neurons in the network. We show that for time series modeling and prediction, this procedure leads to small models (few neurons) that mimic the underlying dynamics of the system well and do not overfit the data. We apply this algorithm to several computational and real systems, including chaotic differential equations, the annual sunspot count, and experimental data obtained from a chaotic laser. Our experiments indicate that the structural differences between ANN and RBF make ANN particularly well suited to modeling chaotic time series data.
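To make the scheme concrete, the Python sketch below illustrates the general idea of description-length model selection: fit one-hidden-layer networks of increasing size with an optimal fitting step rather than prolonged training, then keep the size that minimizes a two-part code length. This is not the authors' algorithm; it assumes randomly drawn hidden weights with least-squares output weights as the "optimal fit," and a BIC-style cost, (N/2)·ln(MSE) + (p/2)·ln N, as a stand-in for the paper's MDL criterion. All function names and the toy series are illustrative.

```python
# Hypothetical sketch of MDL-style size selection for a small neural network.
import numpy as np

def embed(series, dim=4):
    """Time-delay embedding: predict x[t] from the previous `dim` values."""
    X = np.column_stack([series[i:len(series) - dim + i] for i in range(dim)])
    y = series[dim:]
    return X, y

def fit_one_hidden_layer(X, y, k, rng):
    """Fit a k-neuron tanh network: random hidden weights, least-squares
    output weights (a one-shot 'optimal fit' instead of long training)."""
    W = rng.normal(size=(X.shape[1], k))
    b = rng.normal(size=k)
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    Hb = np.column_stack([H, np.ones(len(H))])  # append output-bias column
    w, *_ = np.linalg.lstsq(Hb, y, rcond=None)  # linear output parameters
    resid = y - Hb @ w
    return float(np.mean(resid ** 2))

def description_length(mse, n_params, n_samples):
    """Two-part code: data cost under a Gaussian error model plus a
    (p/2) * ln(N) parameter cost (BIC-style approximation, not the
    paper's exact MDL criterion)."""
    return 0.5 * n_samples * np.log(mse) + 0.5 * n_params * np.log(n_samples)

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)
series = np.sin(t) + 0.1 * rng.normal(size=t.size)  # toy noisy series
X, y = embed(series)

best = None
for k in range(1, 11):                    # candidate numbers of neurons
    mse = fit_one_hidden_layer(X, y, k, rng)
    n_params = k * (X.shape[1] + 2) + 1   # hidden weights+biases, output weights+bias
    dl = description_length(mse, n_params, len(y))
    if best is None or dl < best[1]:
        best = (k, dl)
print(f"MDL-selected size: {best[0]} neurons (DL = {best[1]:.1f})")
```

The key design point the sketch preserves is that model complexity is controlled explicitly: the data cost falls as neurons are added, the parameter cost rises, and the selected network is the one where the total code length turns upward, so overfitting is avoided without early stopping.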
Original language: English
Pages (from-to): 12
Number of pages: 1
Journal: Physical Review E - Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics
Volume: 66
Issue number: 6
DOIs
Publication status: Published - 6 Dec 2002

ASJC Scopus subject areas

  • Statistical and Nonlinear Physics
  • Statistics and Probability
  • Condensed Matter Physics

