Abstract
Full rank principal component analysis (FR-PCA) is a special form of principal component analysis (PCA) that retains all nonzero principal components. In general, it is hard to predict how the accuracy of a classifier will change after the data are compressed by PCA. This paper, however, reveals an interesting fact: the FR-PCA transformation does not change the accuracy of many well-known classification algorithms. Consequently, FR-PCA can be safely used as a preprocessing tool to compress high-dimensional data without deteriorating the accuracy of these classifiers. The main contribution of the paper is a theoretical proof that the FR-PCA transformation does not change the accuracies of the k-nearest neighbor, minimum distance, support vector machine, large margin linear projection, and maximum scatter difference classifiers. In addition, through extensive experiments on several benchmark face image databases, the paper demonstrates that FR-PCA greatly improves the efficiency of these five classification algorithms in appearance-based face recognition.
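As a minimal illustrative sketch (not the paper's own code or data), the snippet below assumes scikit-learn's `PCA`, `KNeighborsClassifier`, and the `digits` dataset to show the idea behind FR-PCA as a preprocessing step: keeping every component up to the rank of the centered data is an orthogonal, information-preserving transformation, so nearest-neighbor rankings, and hence k-NN predictions, are unchanged.

```python
# Illustrative sketch only: FR-PCA keeps all principal components with
# nonzero variance, i.e. a rank-preserving orthogonal transformation.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: k-NN on the raw pixel features.
knn_raw = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
acc_raw = knn_raw.score(X_te, y_te)

# FR-PCA: keep all components up to the rank of the centered training data.
rank = np.linalg.matrix_rank(X_tr - X_tr.mean(axis=0))
frpca = PCA(n_components=rank).fit(X_tr)
knn_fr = KNeighborsClassifier(n_neighbors=3).fit(frpca.transform(X_tr), y_tr)
acc_fr = knn_fr.score(frpca.transform(X_te), y_te)

# Distances from a test point to all training points change only by a term
# that is constant for that test point, so the nearest neighbors (and the
# resulting accuracy) coincide.
print(f"raw: {acc_raw:.4f}  FR-PCA: {acc_fr:.4f}")
```

The same kind of argument, developed rigorously in the paper, extends to the other four classifiers listed in the abstract; the sketch above only illustrates the k-NN case.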
Original language | English |
---|---|
Article number | 1256005 |
Journal | International Journal of Pattern Recognition and Artificial Intelligence |
Volume | 26 |
Issue number | 3 |
DOIs | |
Publication status | Published - 1 May 2012 |
Keywords
- dimension reduction
- face recognition
- pattern classification
- principal component analysis
ASJC Scopus subject areas
- Software
- Computer Vision and Pattern Recognition
- Artificial Intelligence