Minimizing nearest neighbor classification error for nonparametric dimension reduction

Wei Bian, Tianyi Zhou, Aleix M. Martinez, George Baciu, Dacheng Tao

Research output: Journal article publication › Journal article › Academic research › peer-review

5 Citations (Scopus)

Abstract

In this brief, we show that minimizing nearest neighbor classification error (MNNE) is a favorable criterion for supervised linear dimension reduction (SLDR). We prove that MNNE is better than maximizing mutual information in the sense of being a proxy of the Bayes optimal criterion. Based on kernel density estimation, we derive a nonparametric algorithm for MNNE. Experiments on benchmark data sets show the superiority of MNNE over existing nonparametric SLDR methods.
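The abstract only outlines the approach. As a rough illustration of the general idea (not the paper's algorithm), the sketch below performs supervised linear dimension reduction by minimizing a smooth, Gaussian-kernel, leave-one-out surrogate of the nearest neighbor classification error over an orthonormal projection. The function names (nn_error_surrogate, fit_projection), the fixed bandwidth, and the L-BFGS-B optimizer with numerical gradients are assumptions for illustration; the surrogate is an NCA-style soft-neighbor proxy rather than the authors' KDE-based estimator.

```python
# Illustrative sketch only (not the method of Bian et al.): supervised linear
# dimension reduction by minimizing a smoothed leave-one-out NN-error proxy
# computed on the projected data.
import numpy as np
from scipy.optimize import minimize

def nn_error_surrogate(W_flat, X, y, d, bandwidth=1.0):
    """Soft leave-one-out NN-error proxy under a Gaussian kernel (assumed surrogate)."""
    n, D = X.shape
    W, _ = np.linalg.qr(W_flat.reshape(D, d))   # keep the projection orthonormal
    Z = X @ W                                   # project to d dimensions
    # pairwise squared distances in the projected space
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * bandwidth ** 2))
    np.fill_diagonal(K, 0.0)                    # leave-one-out: exclude self
    same = (y[:, None] == y[None, :]).astype(float)
    # kernel-weighted probability that a point's neighbor shares its label
    p_same = (K * same).sum(axis=1) / (K.sum(axis=1) + 1e-12)
    return np.mean(1.0 - p_same)                # expected soft NN error

def fit_projection(X, y, d, bandwidth=1.0, seed=0):
    """Minimize the surrogate over a D-by-d projection matrix (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    W0 = rng.standard_normal((X.shape[1], d))
    res = minimize(nn_error_surrogate, W0.ravel(),
                   args=(X, y, d, bandwidth), method="L-BFGS-B")
    W, _ = np.linalg.qr(res.x.reshape(X.shape[1], d))
    return W

if __name__ == "__main__":
    # toy two-class data: two Gaussian blobs in 5 dimensions
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(2, 1, (50, 5))])
    y = np.array([0] * 50 + [1] * 50)
    W = fit_projection(X, y, d=2)
    print("soft NN error after projection:",
          nn_error_surrogate(W.ravel(), X, y, 2))
```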
Original language: English
Article number: 6698335
Pages (from-to): 1588-1594
Number of pages: 7
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 25
Issue number: 8
DOIs
Publication status: Published - 1 Jan 2014

Keywords

  • Bayes optimal criterion
  • nearest neighbor classification error (NN error)
  • nonparametric methods
  • supervised linear dimension reduction (SLDR)

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
