Abstract
In this brief, we show that minimizing the nearest neighbor classification error (MNNE) is a favorable criterion for supervised linear dimension reduction (SLDR). We prove that MNNE is better than maximizing mutual information in the sense of serving as a proxy for the Bayes optimal criterion. Based on kernel density estimation, we derive a nonparametric algorithm for MNNE. Experiments on benchmark data sets show the superiority of MNNE over existing nonparametric SLDR methods.
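The paper's own algorithm is not reproduced in this record. As a rough illustration of the general idea, the hedged sketch below minimizes a kernel-smoothed leave-one-out 1-NN error over a linear projection `W`; the objective, bandwidth `h`, and optimizer are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch (not the paper's algorithm): minimize a
# Gaussian-kernel-smoothed leave-one-out 1-NN error over a linear map W.
import numpy as np
from scipy.optimize import minimize

def soft_nn_error(W_flat, X, y, d, h=1.0):
    """Kernel-smoothed leave-one-out 1-NN error in the projected space."""
    n, D = X.shape
    W = W_flat.reshape(D, d)
    Z = X @ W                                    # project to d dimensions
    # pairwise squared distances in the projected space
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2.0 * h ** 2))
    np.fill_diagonal(K, 0.0)                     # leave-one-out
    P = K / K.sum(axis=1, keepdims=True)         # soft neighbor weights
    same = (y[:, None] == y[None, :]).astype(float)
    p_correct = (P * same).sum(axis=1)           # prob. the soft NN shares the label
    return 1.0 - p_correct.mean()                # smoothed NN error

rng = np.random.default_rng(0)
# toy data: the class signal lives in the first of 5 input dimensions
X = rng.normal(size=(60, 5))
y = (X[:, 0] > 0).astype(int)
X[:, 0] += 2.0 * y

d = 1
W0 = rng.normal(size=(X.shape[1], d)).ravel()
res = minimize(soft_nn_error, W0, args=(X, y, d), method="L-BFGS-B")
print(soft_nn_error(W0, X, y, d), res.fun)       # error before vs. after optimizing W
```

The smoothed objective is differentiable in `W`, so a gradient-based optimizer applies; as the bandwidth `h` shrinks, the objective approaches the hard 1-NN error.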
| Original language | English |
| --- | --- |
| Article number | 6698335 |
| Pages (from-to) | 1588-1594 |
| Number of pages | 7 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 25 |
| Issue number | 8 |
| DOIs | |
| Publication status | Published - 1 Jan 2014 |
Keywords
- Bayes optimal criterion
- nearest neighbor classification error (NN error)
- nonparametric methods
- supervised linear dimension reduction (SLDR)
ASJC Scopus subject areas
- Software
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence