Gaussian mixture models and probabilistic decision-based neural networks for pattern classification: A comparative study

K. K. Yiu, Man Wai Mak, C. K. Li

Research output: Journal article › Academic research › peer-review

23 Citations (Scopus)

Abstract

Probabilistic Decision-Based Neural Networks (PDBNNs) can be considered a special form of Gaussian Mixture Models (GMMs) with trainable decision thresholds. This paper provides detailed illustrations to compare the recognition accuracy and decision boundaries of PDBNNs with those of GMMs through two pattern recognition tasks, namely the noisy XOR problem and the classification of two-dimensional vowel data. The paper highlights the strengths of PDBNNs by demonstrating that their thresholding mechanism is very effective in detecting data that do not belong to any known class. The original PDBNNs use elliptical basis functions with diagonal covariance matrices, which may be inappropriate for modelling feature vectors with correlated components. This paper overcomes this limitation by using full covariance matrices and shows that they are effective in characterising non-spherical clusters.
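As a minimal illustration of the thresholding idea described in the abstract, the sketch below fits class-conditional GMMs with full covariance matrices and rejects inputs whose log-likelihood falls below a per-class threshold. This is not the authors' PDBNN implementation; the synthetic data, component counts, and percentile-based threshold are illustrative assumptions, and scikit-learn's GaussianMixture stands in for the EM-trained mixtures discussed in the paper.

```python
# Sketch only: GMM-based classification with likelihood thresholding for
# rejecting data that belong to no known class (cf. the PDBNN thresholds).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic 2-D data for two known classes (stand-ins for the XOR/vowel tasks).
class_data = {
    0: rng.normal(loc=[0.0, 0.0], scale=0.3, size=(200, 2)),
    1: rng.normal(loc=[2.0, 2.0], scale=0.3, size=(200, 2)),
}

models, thresholds = {}, {}
for label, X in class_data.items():
    # Full covariance matrices can capture correlated (non-spherical) clusters,
    # unlike the diagonal covariances of the original PDBNN formulation.
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    gmm.fit(X)
    models[label] = gmm
    # Illustrative threshold: a low percentile of the training log-likelihoods.
    thresholds[label] = np.percentile(gmm.score_samples(X), 5)

def classify(x):
    """Return the best-scoring class, or None if every class rejects the input."""
    scores = {label: m.score_samples(x.reshape(1, -1))[0]
              for label, m in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= thresholds[best] else None

print(classify(np.array([0.1, -0.2])))   # accepted, likely class 0
print(classify(np.array([10.0, -5.0])))  # rejected as not belonging to any class
```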
Original language: English
Pages (from-to): 235-245
Number of pages: 11
Journal: Neural Computing and Applications
Volume: 8
Issue number: 3
DOIs
Publication status: Published - 1 Jan 1999

Keywords

  • EM algorithm
  • Gaussian mixture models
  • Pattern classification
  • Probabilistic decision-based neural networks

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
