Abstract
With the increasing deployment of computers across a wide variety of applications, the ability to detect a user's attention, or engagement, is becoming an important piece of contextual information for building effective interactive systems. For instance, a system that is aware of whether the user is attending to it could adapt to the user's activities to enhance productivity. Attention detection would also be useful for system analysis when designing and building better systems. However, much previous work on attention detection is either obtrusive or imposes demanding constraints on the context and the participants. In addition, most approaches rely on unimodal signals, which are often limited in availability and stability. This paper addresses these two major limitations through a noninvasive multimodal solution that allows participants to work naturally without interference. The solution uses common off-the-shelf items that could reasonably be expected in any computing environment and does not rely on expensive, tailor-made equipment. Using a three-class attention state setting, it achieves average accuracy rates of 59.63% to 77.81%; the best result, 77.81%, was obtained for a general searching task and represents an 11.9% improvement over the baseline. We also analyze and discuss the contributions of individual features to the different models.
Original language | English
---|---
Title of host publication | ACHI 2014 - 7th International Conference on Advances in Computer-Human Interactions
Publisher | International Academy, Research and Industry Association, IARIA
Pages | 192-199
Number of pages | 8
ISBN (Electronic) | 9781612083254
Publication status | Published - 1 Jan 2014
Event | 7th International Conference on Advances in Computer-Human Interactions, ACHI 2014 - Barcelona, Spain. Duration: 23 Mar 2014 → 27 Mar 2014
Conference
Conference | 7th International Conference on Advances in Computer-Human Interactions, ACHI 2014
---|---
Country/Territory | Spain
City | Barcelona
Period | 23/03/14 → 27/03/14
Keywords
- Affective computing
- Attention detection
- Facial expression
- Keystroke dynamics
- Multimodal recognition
ASJC Scopus subject areas
- Human-Computer Interaction