Stress is a fact of daily life. It can also impair a user's attention and memory, which, when the user is engaged in interactive applications, degrades both the user experience and the delivered performance. Traditional stress inference relies mainly on physiological signals such as blood volume pulse and galvanic skin response, often captured via devices that intrude on the user's space. In contrast, this paper proposes a non-intrusive approach that exploits the consistency of users' behavioral patterns when interacting with a user interface, specifically in terms of eye gaze and mouse movement. The relationship between the stress experienced by a user and his/her eye gaze and gaze-mouse coordination patterns is investigated. We show that both eye gaze and gaze-mouse coordination patterns can be exploited to distinguish whether a user is under stress. We also discover that a user's eye gaze behavior patterns are more consistent when he/she is under stress. This understanding of how a user's behavior differs under stress could be useful in the development of effective adaptive systems that maximize user potential.
|Name||2017 7th International Conference on Affective Computing and Intelligent Interaction, ACII 2017|
|Period||23/10/17 → 26/10/17|
- Human-Computer Interaction
- Behavioral Neuroscience
- Social Psychology
- Artificial Intelligence