MyoKey: Inertial Motion Sensing and Gesture-Based QWERTY Keyboard for Extended Realities

Kirill A. Shatilov, Young D. Kwon, Lik Hang Lee, Dimitris Chatzopoulos, Pan Hui

Research output: Journal article (academic research, peer-reviewed)

2 Citations (Scopus)


Usability challenges and the social acceptance of textual input in the context of extended reality (XR) motivate research into novel input modalities. We investigate the fusion of inertial measurement unit (IMU) control and surface electromyography (sEMG) gesture recognition applied to text entry on a QWERTY-layout virtual keyboard. We design, implement, and evaluate the proposed multi-modal solution, named MyoKey. The user selects characters with a combination of arm movements and hand gestures. MyoKey employs a lightweight convolutional neural network classifier that can be deployed on a mobile device with negligible inference time. We demonstrate the practicality of interruption-free text entry with MyoKey by recruiting 12 participants and testing three sets of grasp micro-gestures in three scenarios: empty-hand text input, tripod grasp (e.g., holding a pen), and cylindrical grasp (e.g., holding an umbrella). With MyoKey, users achieve average text entry rates of 9.33 words per minute (WPM), 8.76 WPM, and 8.35 WPM for the freehand, tripod grasp, and cylindrical grasp conditions, respectively.
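The abstract mentions a lightweight convolutional neural network that classifies sEMG gestures on a mobile device. As an illustration only — the paper's actual architecture, channel count, window length, and gesture set are not given here, so all of those are assumptions — a minimal NumPy sketch of such an on-device pipeline (1-D convolution, ReLU, global average pooling, linear classifier) could look like:

```python
import numpy as np

def conv1d(x, w, b):
    """Valid 1-D convolution: x is (C_in, T), w is (C_out, C_in, K), b is (C_out,)."""
    c_out, c_in, k = w.shape
    t_out = x.shape[1] - k + 1
    out = np.empty((c_out, t_out))
    for o in range(c_out):
        for t in range(t_out):
            # Dot product of the kernel with one temporal slice of the input
            out[o, t] = np.sum(w[o] * x[:, t:t + k]) + b[o]
    return out

def classify_window(x, params):
    """Hypothetical lightweight classifier: conv -> ReLU -> global avg pool -> linear."""
    h = np.maximum(conv1d(x, params["w1"], params["b1"]), 0.0)  # ReLU activation
    pooled = h.mean(axis=1)                                     # global average pooling
    logits = params["w2"] @ pooled + params["b2"]               # linear output layer
    return int(np.argmax(logits))                               # predicted gesture index

# Toy parameters; shapes (8 sEMG channels, 16 filters, 5 gesture classes) are assumed.
rng = np.random.default_rng(0)
params = {
    "w1": rng.standard_normal((16, 8, 5)) * 0.1,
    "b1": np.zeros(16),
    "w2": rng.standard_normal((5, 16)) * 0.1,
    "b2": np.zeros(5),
}
window = rng.standard_normal((8, 50))  # one 50-sample window of 8-channel sEMG data
pred = classify_window(window, params)
```

A model of this size runs in well under a millisecond per window on a phone-class CPU, which is consistent with the abstract's claim of negligible inference time; the real MyoKey classifier would of course be trained on labeled sEMG recordings rather than using random weights.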

Original language: English
Pages (from-to): 4807-4821
Number of pages: 15
Journal: IEEE Transactions on Mobile Computing
Issue number: 8
Publication status: Published - 1 Aug 2023
Externally published: Yes


Keywords

  • electromyography
  • EMG
  • micro-gestures
  • mobile input techniques

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
  • Electrical and Electronic Engineering


