Abstract
In this paper, we propose a feature-fusion method based on Canonical Correlation Analysis (CCA) for facial-expression recognition. In the proposed method, features are extracted separately from the eye and mouth windows, which are correlated with each other in representing a facial expression. For each window, two effective descriptors, namely Local Phase Quantization (LPQ) and the Pyramid of Histogram of Oriented Gradients (PHOG), are employed to form a low-level representation. These features are then projected into a coherent subspace using CCA so that the correlation between the two windows is maximized. In our experiments, we use the Extended Cohn-Kanade dataset, whose face images span seven emotions: anger, contempt, disgust, fear, happiness, sadness, and surprise. Experimental results show that our method achieves excellent accuracy for facial-expression recognition.
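The sketch below illustrates the general idea of CCA-based feature fusion described in the abstract. It assumes the LPQ and PHOG descriptors for the eye and mouth windows have already been extracted into two feature matrices (here filled with placeholder data), and it fuses the two views by concatenating their canonical projections before classification; this fusion rule is a common choice and may differ from the exact rule used in the paper.

```python
# Minimal sketch of CCA-based feature fusion for expression recognition.
# X_eye / X_mouth stand in for pre-extracted LPQ+PHOG descriptors of the
# eye and mouth windows (one row per face image); labels y cover the
# seven expression classes. All data here is synthetic placeholder data.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, d_eye, d_mouth, n_components = 200, 256, 168, 32

X_eye = rng.standard_normal((n_samples, d_eye))      # eye-window features
X_mouth = rng.standard_normal((n_samples, d_mouth))  # mouth-window features
y = rng.integers(0, 7, size=n_samples)               # 7 expression labels

# Learn a coherent subspace that maximizes the correlation between
# the eye-window and mouth-window feature sets.
cca = CCA(n_components=n_components)
Z_eye, Z_mouth = cca.fit_transform(X_eye, X_mouth)

# Fuse the two views by concatenating their canonical projections.
Z_fused = np.hstack([Z_eye, Z_mouth])

# Train a simple classifier on the fused representation.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(Z_fused, y)
print("training accuracy:", clf.score(Z_fused, y))
```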
| Original language | English |
|---|---|
| Title of host publication | 2014 IEEE International Conference on Image Processing, ICIP 2014 |
| Publisher | IEEE |
| Pages | 5966-5970 |
| Number of pages | 5 |
| ISBN (Electronic) | 9781479957514 |
| DOIs | |
| Publication status | Published - 28 Jan 2014 |
Keywords
- Canonical Correlation Analysis
- facial expression recognition
- feature fusion
- Local Phase Quantization
- Pyramid of Histogram of Oriented Gradients
ASJC Scopus subject areas
- Computer Vision and Pattern Recognition