Locating the human eye using fractal dimensions

K. H. Lin, Kin Man Lam, W. C. Siu

Research output: Conference article published in proceeding or book (academic research, peer-reviewed)

2 Citations (Scopus)


In this paper, a new method for locating eye pairs based on valley field detection and the measurement of fractal dimensions is proposed. The fractal dimension is an efficient representation of the texture of facial features. Possible eye candidates in an image with a complex background are identified by valley field detection. The eye candidates are then grouped into eye pairs if they satisfy the local properties expected of eyes. Two eyes are matched if they have similar roughness and orientation, as represented by their fractal dimensions. We propose a modified approach to estimating fractal dimensions that is less sensitive to lighting conditions and provides information about the orientation of the image under consideration. Possible eye pairs are further verified by comparing the fractal dimensions of the eye-pair window and the corresponding face region with the respective means of the fractal dimensions of the eye-pair windows and the face regions. These means are obtained from a number of facial images in a database. Experiments have shown that this approach is fast and reliable, which demonstrates that the texture of the eyes can be represented very well by fractal surfaces.
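The paper's modified, illumination-robust estimator is not detailed in this abstract; as a rough illustration of the underlying idea, the sketch below estimates the fractal dimension of a grayscale patch with standard differential box counting, fitting the slope of log(box count) against log(1/box size). The function name `fractal_dimension` and the box sizes are illustrative choices, not the authors' implementation.

```python
import numpy as np

def fractal_dimension(patch, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of a 2-D grayscale patch via
    differential box counting (a standard estimator; the paper's
    modified variant is not reproduced here)."""
    patch = np.asarray(patch, dtype=float)
    h, w = patch.shape
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, h - h % s, s):
            for j in range(0, w - w % s, s):
                block = patch[i:i + s, j:j + s]
                # Number of gray-level boxes of height s needed to
                # cover the intensity range within this s x s block.
                n += int(np.floor(block.max() / s)
                         - np.floor(block.min() / s)) + 1
        counts.append(n)
    # The slope of log(count) vs. log(1/size) estimates the dimension:
    # a flat surface gives about 2, a very rough one approaches 3.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

A rougher texture yields a higher estimate, which is the property the paper exploits to match two eye candidates by similar roughness.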
Original language: English
Title of host publication: IEEE International Conference on Image Processing
Number of pages: 4
Publication status: Published - 1 Jan 2001
Event: IEEE International Conference on Image Processing (ICIP) - Thessaloniki, Greece
Duration: 7 Oct 2001 - 10 Oct 2001


Conference: IEEE International Conference on Image Processing (ICIP)

ASJC Scopus subject areas

  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering

