TY - CHAP
T1 - Embodied Interaction on Constrained Interfaces for Augmented Reality
AU - Lee, Lik Hang
AU - Braud, Tristan
AU - Hui, Pan
N1 - Publisher Copyright:
© 2023, Springer Nature Switzerland AG.
PY - 2023/1
Y1 - 2023/1
N2 - Wearable computers have seen a recent resurgence in interest and popularity, and augmented reality (AR) smartglasses are poised to influence the way we complete our work and daily tasks. Nowadays, industrial applications of these smartglasses focus on interior design, remote collaboration, and e-commerce. Under five key constraints of AR smartglasses, namely the miniature touch interface, small screen real estate, user mobility, limited computational resources, and short battery life, existing user interaction paradigms designed for desktop computers and smartphones are obsolete and incompatible with AR smartglasses scenarios. Cumbersome and difficult interaction with AR smartglasses becomes a hurdle to their wider industrial application. Thus, there is a critical unmet demand for interaction techniques designed for AR smartglasses. In this chapter, we present three interaction techniques, namely TiPoint, HIBEY, and TOFI, to enhance object manipulation and text entry in the constrained environment of AR smartglasses. These techniques are devised to leverage advantageous features of the human body and experience, such as the dexterity of the fingertip, the lexicographical order ingrained in our memory, proprioception, and opposable thumbs. We thoroughly address the key constraints of AR smartglasses and explore different modalities with various hardware and peripherals.
AB - Wearable computers have seen a recent resurgence in interest and popularity, and augmented reality (AR) smartglasses are poised to influence the way we complete our work and daily tasks. Nowadays, industrial applications of these smartglasses focus on interior design, remote collaboration, and e-commerce. Under five key constraints of AR smartglasses, namely the miniature touch interface, small screen real estate, user mobility, limited computational resources, and short battery life, existing user interaction paradigms designed for desktop computers and smartphones are obsolete and incompatible with AR smartglasses scenarios. Cumbersome and difficult interaction with AR smartglasses becomes a hurdle to their wider industrial application. Thus, there is a critical unmet demand for interaction techniques designed for AR smartglasses. In this chapter, we present three interaction techniques, namely TiPoint, HIBEY, and TOFI, to enhance object manipulation and text entry in the constrained environment of AR smartglasses. These techniques are devised to leverage advantageous features of the human body and experience, such as the dexterity of the fingertip, the lexicographical order ingrained in our memory, proprioception, and opposable thumbs. We thoroughly address the key constraints of AR smartglasses and explore different modalities with various hardware and peripherals.
KW - Augmented reality
KW - Constrained interfaces
KW - Embodied interaction
KW - Human-computer interaction
KW - Interface design
KW - Keyboard-less text entry
KW - Mid-air interaction
KW - Target acquisition
KW - Thumb-to-finger space interaction
KW - Wearable computing
UR - http://www.scopus.com/inward/record.url?scp=85145831426&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-67822-7_10
DO - 10.1007/978-3-030-67822-7_10
M3 - Chapter in an edited book (as author)
AN - SCOPUS:85145831426
T3 - Springer Handbooks
SP - 239
EP - 271
BT - Springer Handbooks
PB - Springer Science and Business Media Deutschland GmbH
ER -