Facial Position and Expression-Based Human-Computer Interface for Persons with Tetraplegia

Zhen Peng Bian, Junhui Hou, Lap Pui Chau, Nadia Magnenat-Thalmann

Research output: Journal article › Academic research › Peer-reviewed

28 Citations (Scopus)


A human-computer interface (namely the Facial position and expression Mouse system, FM) for persons with tetraplegia, based on a monocular infrared depth camera, is presented in this paper. The nose position, together with the mouth status (closed/open), is detected by the proposed algorithm to control and navigate the cursor as computer user input. The algorithm is based on an improved Randomized Decision Tree, which detects the facial information efficiently and accurately. A more comfortable user experience is achieved by mapping the nose motion to the cursor motion via a nonlinear function. The infrared depth camera makes the system independent of illumination and color changes, both in the background and on the human face, which is a critical advantage over RGB camera-based options. Extensive experimental results show that the proposed system outperforms existing assistive technologies in both quantitative and qualitative assessments.
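The nonlinear nose-to-cursor mapping described above can be sketched as follows. This is an illustrative power-law mapping with a dead zone, not the paper's actual function; the function form, gain, exponent, and dead-zone values here are assumptions chosen to show the idea that small jitter is suppressed while deliberate motion is amplified super-linearly.

```python
import math

def nonlinear_cursor_gain(dx, dy, gain=1.5, power=1.6, dead_zone=0.002):
    """Map a nose displacement (dx, dy), in normalized image units,
    to a cursor displacement in pixels.

    Displacements inside the dead zone are treated as jitter and
    suppressed; larger, deliberate motions are amplified with an
    exponent > 1, so fine positioning stays precise while large
    traversals across the screen stay fast.
    """
    def f(d):
        m = abs(d)
        if m < dead_zone:
            return 0.0
        # power > 1 gives super-linear gain: doubling the motion
        # more than doubles the cursor displacement
        return math.copysign(gain * (m ** power) * 1000.0, d)
    return f(dx), f(dy)
```

With this shape of mapping, a user can hover steadily over small targets (dead zone absorbs tremor) yet still reach distant targets with a single larger head motion, which is the comfort trade-off the abstract alludes to.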

Original language: English
Article number: 7058374
Pages (from-to): 915-924
Number of pages: 10
Journal: IEEE Journal of Biomedical and Health Informatics
Issue number: 3
Publication status: Published - May 2016
Externally published: Yes


Keywords

  • assistive technology (AT)
  • camera mouse
  • computer access
  • Fitts' law
  • hands-free control
  • human-computer interaction (HCI)
  • perceptual user interface
  • severe disabilities

ASJC Scopus subject areas

  • Biotechnology
  • Computer Science Applications
  • Electrical and Electronic Engineering
  • Health Information Management


