A class of competitive learning models which avoids neuron underutilization problem

Sze Tsan Choy, Wan Chi Siu

Research output: Journal article publication › Journal article › Academic research › peer-review

10 Citations (Scopus)

Abstract

In this paper, we study a qualitative property of a class of competitive learning (CL) models, called the multiplicatively biased competitive learning (MBCL) model: it avoids neuron underutilization with probability one as time goes to infinity. In MBCL, the competition among neurons is biased by a multiplicative term, while only one weight vector is updated per learning step. This is of practical interest, since instances of MBCL have computational complexities among the lowest of existing CL models. In addition, in applications such as classification, vector quantizer design, and probability density function estimation, avoiding neuron underutilization is a necessary condition for optimal performance. Hence, instances of MBCL can be defined to achieve optimal performance in these applications.
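To make the mechanism in the abstract concrete, here is a minimal Python sketch of one multiplicatively biased competition step. It is an illustration, not the paper's algorithm: the win-count (frequency-sensitive) bias, the function name `mbcl_step`, and the learning rate are assumptions chosen for the example. The key features it mirrors are that each neuron's distortion is scaled by a multiplicative bias term and that only the winning weight vector is updated per step.

```python
import numpy as np

def mbcl_step(x, weights, bias, lr=0.05):
    """One multiplicatively biased competitive learning step (illustrative sketch).

    x       : input sample, shape (d,)
    weights : neuron weight vectors, shape (n, d), updated in place
    bias    : per-neuron multiplicative bias, shape (n,), updated in place
    """
    # Competition biased by a multiplicative term: bias[i] * ||x - w_i||^2
    distortions = bias * np.sum((weights - x) ** 2, axis=1)
    winner = int(np.argmin(distortions))
    # Only the winner's weight vector is updated in this learning step
    weights[winner] += lr * (x - weights[winner])
    # Frequency-sensitive bias (one assumed instance): neurons that win often
    # are handicapped in later competitions, discouraging underutilization
    bias[winner] += 1.0
    return winner

# Usage sketch: quantize 2-D data with 8 neurons
rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 2))
weights = rng.normal(size=(8, 2))
bias = np.ones(8)
for x in data:
    mbcl_step(x, weights, bias)
```

Because the bias grows with each win, a neuron that never wins eventually has the smallest biased distortion and is pulled into use, which is the intuition behind avoiding underutilization.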
Original language: English
Pages (from-to): 1258-1269
Number of pages: 12
Journal: IEEE Transactions on Neural Networks
Volume: 9
Issue number: 6
DOIs
Publication status: Published - 1 Dec 1998

Keywords

  • Multiplicatively biased competitive learning
  • Neuron underutilization problem
  • Vector quantization

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Theoretical Computer Science
  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Hardware and Architecture

