TY - JOUR
T1 - MyoKey: Inertial Motion Sensing and Gesture-Based QWERTY Keyboard for Extended Realities
AU - Shatilov, Kirill A.
AU - Kwon, Young D.
AU - Lee, Lik Hang
AU - Chatzopoulos, Dimitris
AU - Hui, Pan
N1 - Funding Information:
This work was supported in part by the 5GEAR Project under Grant 319669 and in part by the FIT Project under Grant 325570 from the Academy of Finland.
Publisher Copyright:
© 2023 IEEE.
PY - 2023/8/1
Y1 - 2023/8/1
AB - Usability challenges and the limited social acceptance of text input in the context of extended reality (XR) motivate research into novel input modalities. We investigate the fusion of inertial measurement unit (IMU) control and surface electromyography (sEMG) gesture recognition for text entry on a QWERTY-layout virtual keyboard. We design, implement, and evaluate the proposed multi-modal solution, named MyoKey, in which the user selects characters through a combination of arm movements and hand gestures. MyoKey employs a lightweight convolutional neural network classifier that can be deployed on a mobile device with negligible inference time. We demonstrate the practicality of interruption-free text entry with MyoKey by recruiting 12 participants and testing three sets of grasp micro-gestures in three scenarios: freehand text input, a tripod grasp (e.g., holding a pen), and a cylindrical grasp (e.g., holding an umbrella). With MyoKey, users achieve average text entry rates of 9.33 words per minute (WPM), 8.76 WPM, and 8.35 WPM for the freehand, tripod grasp, and cylindrical grasp conditions, respectively.
KW - electromyography
KW - EMG
KW - micro-gestures
KW - mobile input techniques
UR - http://www.scopus.com/inward/record.url?scp=85126671809&partnerID=8YFLogxK
U2 - 10.1109/TMC.2022.3156939
DO - 10.1109/TMC.2022.3156939
M3 - Journal article
AN - SCOPUS:85126671809
SN - 1536-1233
VL - 22
SP - 4807
EP - 4821
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
IS - 8
ER -