TY - JOUR
T1 - Vision Transformers (ViT) for Blanket-Penetrating Sleep Posture Recognition Using a Triple Ultra-Wideband (UWB) Radar System
AU - Lai, Derek Ka Hei
AU - Yu, Zi Han
AU - Leung, Tommy Yau Nam
AU - Lim, Hyo Jung
AU - Tam, Andy Yiu Chau
AU - So, Bryan Pak Hei
AU - Mao, Ye Jiao
AU - Cheung, Daphne Sze Ki
AU - Wong, Duo Wai Chi
AU - Cheung, James Chung Wai
N1 - Funding Information:
The work was supported by the General Research Fund from the Research Grants Council of Hong Kong, China (reference number: PolyU15223822), as well as the internal fund from the Research Institute for Smart Ageing (reference number: P0039001) and the Department of Biomedical Engineering (reference number: P0033913 and P0035896) at Hong Kong Polytechnic University.
Publisher Copyright:
© 2023 by the authors.
PY - 2023/2/23
Y1 - 2023/2/23
N2 - Sleep posture has a crucial impact on the incidence and severity of obstructive sleep apnea (OSA). Therefore, the surveillance and recognition of sleep postures could facilitate the assessment of OSA. The existing contact-based systems might interfere with sleeping, while camera-based systems introduce privacy concerns. Radar-based systems might overcome these challenges, especially when individuals are covered with blankets. The aim of this research is to develop a nonobstructive multiple ultra-wideband radar sleep posture recognition system based on machine learning models. We evaluated three single-radar configurations (top, side, and head), three dual-radar configurations (top + side, top + head, and side + head), and one tri-radar configuration (top + side + head), in addition to machine learning models, including CNN-based networks (ResNet50, DenseNet121, and EfficientNetV2) and vision transformer-based networks (traditional vision transformer and Swin Transformer V2). Thirty participants (n = 30) were invited to perform four recumbent postures (supine, left side-lying, right side-lying, and prone). Data from eighteen participants were randomly chosen for model training, another six participants’ data (n = 6) for model validation, and the remaining six participants’ data (n = 6) for model testing. The Swin Transformer with side and head radar configuration achieved the highest prediction accuracy (0.808). Future research may consider the application of the synthetic aperture radar technique.
KW - ablation study
KW - deep learning
KW - feature extraction
KW - obstructive sleep apnea
KW - sleep monitoring
UR - http://www.scopus.com/inward/record.url?scp=85149790126&partnerID=8YFLogxK
U2 - 10.3390/s23052475
DO - 10.3390/s23052475
M3 - Journal article
C2 - 36904678
AN - SCOPUS:85149790126
SN - 1424-8220
VL - 23
JO - Sensors
JF - Sensors
IS - 5
M1 - 2475
ER -