A class of competitive learning models which avoids neuron underutilization problem

Sze Tsan Choy, Wan Chi Siu

Research output: Journal article · Academic research · peer-review

10 Citations (Scopus)


In this paper, we study a qualitative property of a class of competitive learning (CL) models called multiplicatively biased competitive learning (MBCL): with probability one, MBCL avoids neuron underutilization as time goes to infinity. In MBCL, the competition among neurons is biased by a multiplicative term, and only one weight vector is updated per learning step. This is of practical interest because instances of MBCL have computational complexities among the lowest of existing CL models. Moreover, in applications such as classification, vector quantizer design, and probability density function estimation, avoiding neuron underutilization is a necessary condition for optimal performance. Hence, instances of MBCL can be defined to achieve optimal performance in these applications.
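As an illustration of the scheme the abstract describes, the following is a minimal sketch of one learning step, assuming a frequency-sensitive bias (each neuron's win count) as the multiplicative term; this is one well-known instance of the MBCL class, not the paper's specific formulation. The competition is biased multiplicatively, only the winning weight vector is updated, and neurons that win rarely become progressively more competitive, which is the mechanism that counters underutilization.

```python
import numpy as np

def mbcl_step(x, weights, counts, lr=0.05):
    """One multiplicatively biased competitive learning step.

    Each neuron's distortion is scaled by a multiplicative bias
    (here its win count, as in frequency-sensitive CL). Only the
    winning neuron's weight vector is updated, so a step costs
    O(num_neurons * dim).
    """
    dists = np.sum((weights - x) ** 2, axis=1)     # squared distances
    winner = np.argmin(counts * dists)             # biased competition
    weights[winner] += lr * (x - weights[winner])  # update winner only
    counts[winner] += 1                            # its bias grows
    return winner

# Demo: two well-separated clusters, three neurons all initialized
# near the first cluster. With an unbiased winner-take-all rule a
# neuron could win every input; the multiplicative bias lets
# rarely-winning neurons win as well.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.1, (200, 2)),
                  rng.normal(3.0, 0.1, (200, 2))])
weights = rng.normal(0.0, 0.01, (3, 2))
counts = np.ones(3)
for x in data:
    mbcl_step(x, weights, counts)
print(counts)  # every neuron has won at least once
```

Note the design point from the abstract: because only one weight vector and one bias term are touched per input, the per-step cost stays among the lowest of CL models.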
Original language: English
Pages (from-to): 1258-1269
Number of pages: 12
Journal: IEEE Transactions on Neural Networks
Issue number: 6
Publication status: Published - 1 Dec 1998


Keywords

  • Multiplicatively biased competitive learning
  • Neuron underutilization problem
  • Vector quantization

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Theoretical Computer Science
  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Hardware and Architecture

