Minimizing nearest neighbor classification error for nonparametric dimension reduction

Wei Bian, Tianyi Zhou, Aleix M. Martinez, George Baciu, Dacheng Tao

Research output: Journal article publication › Journal article › Academic research › peer-review

5 Citations (Scopus)


In this brief, we show that minimizing nearest neighbor classification error (MNNE) is a favorable criterion for supervised linear dimension reduction (SLDR). We prove that MNNE is better than maximizing mutual information in the sense of being a proxy of the Bayes optimal criterion. Based on kernel density estimation, we derive a nonparametric algorithm for MNNE. Experiments on benchmark data sets show the superiority of MNNE over existing nonparametric SLDR methods.
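The plug-in idea behind the abstract can be illustrated with a small sketch. This is an illustration under stated assumptions, not the paper's algorithm: it estimates class posteriors in the projected space with a leave-one-out Gaussian kernel density estimate and plugs them into the asymptotic nearest-neighbor error formula E[1 − Σ_c p(c|z)²], so that candidate projections can be compared by their estimated NN error. The function names (`kde_posteriors`, `nn_error_estimate`), the fixed bandwidth, and the toy data are all hypothetical choices made for the example.

```python
import numpy as np


def kde_posteriors(Z, y, bandwidth=0.5):
    """Leave-one-out Gaussian KDE estimates of the class posteriors p(c | z_i).

    Illustrative helper, not from the paper: each point's posterior is the
    kernel-weighted fraction of its neighbors in each class, with the point
    itself excluded (leave-one-out).
    """
    d2 = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    np.fill_diagonal(K, 0.0)  # leave-one-out: drop each point's self-similarity
    post = np.stack([K[:, y == c].sum(axis=1) for c in np.unique(y)], axis=1)
    return post / post.sum(axis=1, keepdims=True)


def nn_error_estimate(X, y, W, bandwidth=0.5):
    """Plug-in estimate of the asymptotic NN error E[1 - sum_c p(c|z)^2]
    in the projected space z = W^T x (hypothetical criterion evaluator)."""
    Z = X @ W
    P = kde_posteriors(Z, y, bandwidth)
    return float(np.mean(1.0 - np.sum(P ** 2, axis=1)))


# Toy data: two Gaussian classes separated along the first coordinate only.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([3.0, 0.0], 1.0, (100, 2)),
               rng.normal([-3.0, 0.0], 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# A discriminative projection should score a much lower estimated NN error
# than a projection onto the uninformative coordinate.
err_good = nn_error_estimate(X, y, np.array([[1.0], [0.0]]))
err_bad = nn_error_estimate(X, y, np.array([[0.0], [1.0]]))
```

In a full method one would search over `W` (e.g. by gradient descent on a smoothed version of this estimate) rather than compare two fixed projections; the sketch only shows why the estimated NN error is a usable scalar criterion for ranking linear projections.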
Original language: English
Article number: 6698335
Pages (from-to): 1588-1594
Number of pages: 7
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 8
Publication status: Published - 1 Jan 2014


Keywords

  • Bayes optimal criterion
  • nearest neighbor classification error (NN error)
  • nonparametric methods
  • supervised linear dimension reduction (SLDR)

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence


