Face recognition based on illumination restoration

Dang Hui Liu, Lan Sun Shen, Kin Man Lam, Xiao Kong

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

Variations in lighting conditions make face recognition an even more challenging task. In this paper, a novel approach is proposed to handle the illumination problem. Our method can restore a face image captured under arbitrary lighting conditions to one with frontal illumination by using a ratio-image and an iterative algorithm. The restored images with frontal illumination are then used for face recognition by means of PCA. Experimental results on the Yale B and Yale databases demonstrate that our method achieves a higher recognition rate. Moreover, our algorithm has several advantages over previous algorithms: (1) it does not need to estimate the face surface normals or the light source directions, (2) it does not need many images of each person captured under different lighting conditions, nor a set of bootstrap images covering many different illuminations, and (3) it does not need to detect accurate positions of facial feature points or to warp images for alignment.
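The abstract describes a two-stage pipeline: illumination restoration followed by PCA-based recognition. The sketch below illustrates only the second stage, a conventional PCA (eigenfaces) recognizer applied to already-restored, flattened face images. It is a minimal illustration under the assumption that restoration has been done separately; the function names (pca_fit, pca_project, recognize), the number of components, and the nearest-neighbour matching rule are illustrative choices, not the authors' implementation.

```python
import numpy as np

def pca_fit(train_images, num_components=20):
    """Fit an eigenfaces model on a stack of flattened, restored face images.

    train_images: (n_samples, n_pixels) array.
    Returns the mean face and the top principal components.
    """
    mean_face = train_images.mean(axis=0)
    centered = train_images - mean_face
    # SVD of the centered data gives the eigenvectors of the covariance matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_face, vt[:num_components]

def pca_project(images, mean_face, components):
    """Project flattened images onto the eigenface subspace."""
    return (images - mean_face) @ components.T

def recognize(probe, gallery_features, gallery_labels, mean_face, components):
    """Nearest-neighbour matching of a probe image in the PCA subspace."""
    probe_feat = pca_project(probe[None, :], mean_face, components)
    dists = np.linalg.norm(gallery_features - probe_feat, axis=1)
    return gallery_labels[int(np.argmin(dists))]

# Usage sketch: gallery images (one or more restored images per person) are
# projected once; each probe is restored, projected, and matched.
# mean_face, components = pca_fit(gallery_images)
# gallery_features = pca_project(gallery_images, mean_face, components)
# predicted_id = recognize(probe_image, gallery_features, gallery_labels,
#                          mean_face, components)
```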
Original language: English
Title of host publication: 2004 International Symposium on Intelligent Multimedia, Video and Speech Processing, ISIMP 2004
Pages: 105-108
Number of pages: 4
Publication status: Published - 1 Dec 2004
Event: 2004 International Symposium on Intelligent Multimedia, Video and Speech Processing, ISIMP 2004 - Hong Kong, China
Duration: 20 Oct 2004 - 22 Oct 2004

Conference

Conference: 2004 International Symposium on Intelligent Multimedia, Video and Speech Processing, ISIMP 2004
Country/Territory: Hong Kong
City: Hong Kong, China
Period: 20/10/04 - 22/10/04

ASJC Scopus subject areas

  • General Engineering
