Abstract
This paper introduces the Multi-modal Interface Framework (MIF), a system that allows developers to easily integrate interface devices of multiple modalities, such as voice, hand and finger gestures, and tangible devices such as game controllers, into a multi-modal input system. The integrated devices can then be used to control practically any computer application. MIF offers ease of use, flexibility, and support for collaboration. Its design has been validated by using it to integrate finger gestures, voice, a Wii Remote and an iPhone to control applications such as Google Earth and Windows Media Player.
| Original language | English |
|---|---|
| Title of host publication | 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010 |
| Pages | 3001-3007 |
| Number of pages | 7 |
| DOIs | |
| Publication status | Published - 1 Dec 2010 |
| Event | 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010, Istanbul, Turkey, 10 Oct 2010 → 13 Oct 2010 |
Conference
| Conference | 2010 IEEE International Conference on Systems, Man and Cybernetics, SMC 2010 |
|---|---|
| Country/Territory | Turkey |
| City | Istanbul |
| Period | 10/10/10 → 13/10/10 |
Keywords
- Multi-modal interface
- Tangible interface
- User interface framework
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Control and Systems Engineering
- Human-Computer Interaction