Abstract
We propose a linear discriminant analysis method in which every discriminant vector, except for the first one, is obtained by maximizing a Fisher criterion defined in a transformed space, namely the null space of the previously obtained discriminant vectors. All of these discriminant vectors are used for dimension reduction. We also propose two algorithms to implement the model and, based on these algorithms, prove that the discriminant vectors are orthogonal if the within-class scatter matrix is nonsingular. The experimental results show that the proposed method is effective and efficient.
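A minimal numerical sketch of the idea described above (not the paper's two algorithms, whose details are not given in the abstract): each discriminant vector after the first is sought in the orthogonal complement of the span of the previously found vectors, by maximizing the Fisher criterion restricted to that subspace. The function name `fisher_directions`, the projector-based deflation, and the pseudo-inverse step are illustrative assumptions.

```python
# Illustrative sketch only: sequential Fisher directions, each restricted to the
# complement (null space) of the span of the previously obtained vectors.
import numpy as np


def fisher_directions(X, y, n_components):
    """Return n_components discriminant vectors for data X (n_samples, n_features)
    with class labels y, extracted one at a time."""
    n_features = X.shape[1]
    classes = np.unique(y)
    mean_total = X.mean(axis=0)

    # Between-class (Sb) and within-class (Sw) scatter matrices.
    Sb = np.zeros((n_features, n_features))
    Sw = np.zeros((n_features, n_features))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        d = (mc - mean_total)[:, None]
        Sb += Xc.shape[0] * (d @ d.T)
        Sw += (Xc - mc).T @ (Xc - mc)

    W = []                     # discriminant vectors found so far
    P = np.eye(n_features)     # projector onto the complement of span(W)
    for _ in range(n_components):
        # Fisher criterion w'Sb w / w'Sw w restricted to the projected space.
        Sb_p = P @ Sb @ P
        Sw_p = P @ Sw @ P
        # Maximize via the leading eigenvector of pinv(Sw_p) @ Sb_p (assumed solver).
        eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw_p) @ Sb_p)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        w = P @ w                       # keep w in the complement of earlier vectors
        w /= np.linalg.norm(w)
        W.append(w)
        P = P - np.outer(w, w)          # remove the direction just found
    return np.column_stack(W)


# Usage: project data onto the discriminant vectors for dimension reduction.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 1.0, size=(50, 4)) for m in (0.0, 2.0, 4.0)])
    y = np.repeat([0, 1, 2], 50)
    W = fisher_directions(X, y, n_components=2)
    X_reduced = X @ W
    print(X_reduced.shape)  # (150, 2)
```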
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 2168-2176 |
| Number of pages | 9 |
| Journal | Neurocomputing |
| Volume | 73 |
| Issue number | 10-12 |
| DOIs | |
| Publication status | Published - 1 Jun 2010 |
Keywords
- Dimension reduction
- Fisher discriminant analysis
- Orthogonal discriminant vectors
- Pattern recognition
ASJC Scopus subject areas
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence