Introduction to a framework for multi-modal and tangible interaction

Kenneth W.K. Lo, Will W.W. Tang, Grace Ngai, Stephen C.F. Chan, Jason T.P. Tse

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

1 Citation (Scopus)

Abstract

This paper introduces the Multi-modal Interface Framework (MIF), a system that allows developers to easily integrate interface devices of multiple modalities, such as voice and hand and finger gestures, together with tangible devices such as game controllers, into a single multi-modal input system. The integrated devices can then be used to control practically any computer application. The advantages offered by MIF are ease of use, flexibility, and support for collaboration. Its design has been validated by using it to integrate finger gestures, voice, a Wiimote, and an iPhone to control applications such as Google Earth and Windows Media Player.
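The abstract describes MIF as a layer that normalizes events from heterogeneous devices and routes them to application commands. The paper's actual API is not given in this record, so the sketch below is purely illustrative: all names (InputEvent, Dispatcher, bind, dispatch) are assumptions showing one minimal way such device-to-command routing could work, with two modalities mapped to the same command.

```python
# Hypothetical sketch of the device-integration idea described in the
# abstract. MIF's real interfaces are not shown in this record; every
# name here is an illustrative assumption, not the authors' API.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple


@dataclass(frozen=True)
class InputEvent:
    """A device-independent event: which modality produced it and what it means."""
    modality: str   # e.g. "voice", "gesture", "wiimote", "iphone"
    action: str     # e.g. "zoom_in", "swipe_left", "play"


class Dispatcher:
    """Routes normalized events from any modality to application commands."""

    def __init__(self) -> None:
        self._bindings: Dict[Tuple[str, str], Callable[[], None]] = {}

    def bind(self, modality: str, action: str, command: Callable[[], None]) -> None:
        # Register an application command for a (modality, action) pair.
        self._bindings[(modality, action)] = command

    def dispatch(self, event: InputEvent) -> None:
        # Look up and run the command bound to this event, if any.
        handler = self._bindings.get((event.modality, event.action))
        if handler is not None:
            handler()


if __name__ == "__main__":
    dispatcher = Dispatcher()
    # Two modalities bound to the same application command, so a voice
    # utterance and a Wiimote gesture are interchangeable to the app.
    dispatcher.bind("voice", "zoom_in", lambda: print("app: zoom in"))
    dispatcher.bind("wiimote", "push_forward", lambda: print("app: zoom in"))

    dispatcher.dispatch(InputEvent("voice", "zoom_in"))        # app: zoom in
    dispatcher.dispatch(InputEvent("wiimote", "push_forward")) # app: zoom in
```

Decoupling device adapters from application commands in this way is what would let a framework like MIF claim to control "practically any computer application": the application side sees only normalized commands, never raw device input.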
Original language: English
Title of host publication: 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010
Pages: 3001-3007
Number of pages: 7
DOIs
Publication status: Published - 1 Dec 2010
Event: 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010 - Istanbul, Turkey
Duration: 10 Oct 2010 - 13 Oct 2010

Conference

Conference: 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010
Country/Territory: Turkey
City: Istanbul
Period: 10/10/10 - 13/10/10

Keywords

  • Multi-modal interface
  • Tangible interface
  • User interface framework

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Human-Computer Interaction
