TY - JOUR
T1 - Unleashing mixed-reality capability in Deep Reinforcement Learning-based robot motion generation towards safe human–robot collaboration
AU - Li, Chengxi
AU - Zheng, Pai
AU - Zhou, Peng
AU - Yin, Yue
AU - Lee, Carman K.M.
AU - Wang, Lihui
N1 - Funding information:
This research work was partially supported by grants from the General Research Fund (GRF) of the Research Grants Council of the Hong Kong Special Administrative Region, China (Project Nos. PolyU15210222 and PolyU15206723); the Laboratory for Artificial Intelligence in Design (Project Code: RP2-1), Innovation and Technology Fund, Hong Kong Special Administrative Region; and the Research Committee of The Hong Kong Polytechnic University under the Collaborative Departmental General Research Fund (G-UAMS), Hong Kong Special Administrative Region.
Publisher Copyright:
© 2024
PY - 2024/6
Y1 - 2024/6
AB - The integration of human–robot collaboration yields substantial benefits, particularly in terms of enhancing flexibility and efficiency within a range of mass-personalized manufacturing tasks, for example, small-batch customized product inspection and assembly/disassembly. Meanwhile, as human–robot collaboration is deployed more broadly in manufacturing, uncertainties arising from unstructured scenes and operators are increasingly involved and must be considered. Consequently, it becomes imperative for robots to operate in a safe and adaptive manner rather than relying solely on pre-programmed instructions. To address this, a systematic solution for safe robot motion generation in human–robot collaborative activities is proposed, leveraging mixed-reality technologies and Deep Reinforcement Learning. This solution covers the entire collaboration process, starting with an intuitive interface that facilitates bare-hand task command transmission and scene coordinate transformation before the collaboration begins. In particular, mixed-reality devices are employed as effective tools for representing the state of humans, robots, and scenes. This enables the learning of an end-to-end Deep Reinforcement Learning policy that addresses uncertainties in both robot perception and decision-making in an integrated manner. The proposed solution also implements policy simulation-to-reality deployment, along with motion preview and collision detection mechanisms, to ensure safe robot motion execution. It is hoped that this work will inspire further research in human–robot collaboration to unleash and exploit the powerful capabilities of mixed reality.
KW - Deep Reinforcement Learning
KW - Human–robot collaboration
KW - Manufacturing safety
KW - Mixed reality
KW - Smart manufacturing
UR - http://www.scopus.com/inward/record.url?scp=85190069993&partnerID=8YFLogxK
U2 - 10.1016/j.jmsy.2024.03.015
DO - 10.1016/j.jmsy.2024.03.015
M3 - Journal article
AN - SCOPUS:85190069993
SN - 0278-6125
VL - 74
SP - 411
EP - 421
JO - Journal of Manufacturing Systems
JF - Journal of Manufacturing Systems
ER -