Elastic shape-texture matching for human face recognition

Xudong Xie, Kin Man Lam

Research output: Journal article publication › Journal article › Academic research › peer-review

23 Citations (Scopus)

Abstract

In this paper, a novel elastic shape-texture matching (ESTM) method for human face recognition is proposed. In our approach, both shape and texture information are used to compare two faces without establishing any precise pixel-wise correspondence. The edge map represents the shape of an image, while the texture information is characterized by both the Gabor representations and the gradient direction at each pixel. Combining these features, a shape-texture Hausdorff distance is devised to compute the similarity of two face images. The elastic matching is robust to small, local distortions of the feature points, such as those caused by variations in facial expression. In addition, the use of the edge map, the Gabor representations, and the direction of the image gradient can all alleviate the effect of illumination to a certain extent. Experimental results on different databases show that our algorithm consistently outperforms other face recognition algorithms under a range of conditions, except when an image is under poor and uneven illumination. Experiments based on the Yale, AR, ORL and YaleB databases show that our proposed method achieves recognition rates of 88.7%, 97.7%, 78.3% and 89.5%, respectively.
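To make the idea of a shape-texture Hausdorff distance concrete, the Python sketch below shows one plausible way to combine edge-point positions (shape) with per-point texture descriptors such as Gabor magnitudes and gradient direction (texture). It is a simplified illustration, not the paper's exact ESTM formulation: the function names, the linear alpha weighting between spatial and texture dissimilarity, and the mean-based (modified-Hausdorff-style) aggregation are all assumptions made here for clarity.

    # Illustrative sketch only; the weighting scheme and aggregation are
    # assumptions, not the paper's exact shape-texture Hausdorff distance.
    import numpy as np

    def directed_st_distance(pts_a, feat_a, pts_b, feat_b, alpha=0.5):
        """Directed distance from edge set A to edge set B.

        pts_*  : (N, 2) arrays of edge-point coordinates (shape term).
        feat_* : (N, D) arrays of per-point texture descriptors, e.g. Gabor
                 magnitudes plus gradient direction (texture term).
        alpha  : assumed weight between shape and texture dissimilarity.
        """
        total = 0.0
        for p, f in zip(pts_a, feat_a):
            # Spatial distance from this point to every point in B.
            spatial = np.linalg.norm(pts_b - p, axis=1)
            # Texture dissimilarity to every point in B.
            texture = np.linalg.norm(feat_b - f, axis=1)
            # Elastic match: each point pairs with its best combined match
            # in B, so small local distortions (e.g. expression changes)
            # are tolerated rather than penalized.
            total += np.min(alpha * spatial + (1.0 - alpha) * texture)
        # Averaging, as in the modified Hausdorff distance, is more robust
        # to outlier edge points than taking the maximum.
        return total / len(pts_a)

    def st_hausdorff(pts_a, feat_a, pts_b, feat_b, alpha=0.5):
        # Symmetrize the two directed distances.
        return max(directed_st_distance(pts_a, feat_a, pts_b, feat_b, alpha),
                   directed_st_distance(pts_b, feat_b, pts_a, feat_a, alpha))

In practice, the descriptor for each edge pixel would be built from the features named in the abstract, e.g. Gabor magnitudes at several scales and orientations together with the local gradient direction; a smaller distance then indicates more similar faces.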
Original language: English
Pages (from-to): 396-405
Number of pages: 10
Journal: Pattern Recognition
Volume: 41
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2008

Keywords

  • Elastic shape-texture matching
  • Face recognition
  • Gabor wavelets
  • Hausdorff distance

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
