TY - JOUR
T1 - Toward Proactive Human–Robot Collaborative Assembly: A Multimodal Transfer-Learning-Enabled Action Prediction Approach
AU - Li, Shufei
AU - Zheng, Pai
AU - Fan, Junming
AU - Wang, Lihui
N1 - Funding information:
This research work was partially funded by the Laboratory for Artificial Intelligence in Design (Project Code: RP2-1), Hong Kong Special Administrative Region, and the Research Committee of The Hong Kong Polytechnic University under the Departmental General Research Fund (G-UAHH). This research project was also approved by the Human Subjects Ethics Sub-committee (HSESC) at The Hong Kong Polytechnic University (No. HSEARS20201110002).
Publisher Copyright:
IEEE
PY - 2022/8/1
Y1 - 2022/8/1
N2 - Human-robot collaborative assembly (HRCA) is vital for achieving high-level flexible automation for mass personalization in today's smart factories. However, existing works in both industry and academia mainly focus on adaptive robot planning and seldom consider human operators' intentions in advance, which hinders the transition of HRCA toward a proactive manner. To overcome this bottleneck, this article proposes a multimodal transfer-learning-enabled action prediction approach, serving as the prerequisite for proactive HRCA. First, a multimodal intelligence-based action recognition approach is proposed to predict ongoing human actions by leveraging the visual stream and skeleton stream with short-time input frames. Second, a transfer-learning-enabled model is adapted to rapidly transfer learnt knowledge from daily activities to industrial assembly operations for online operator intention analysis. Third, a dynamic decision-making mechanism, including robotic decision and motion control, is described to allow mobile robots to assist operators in a proactive manner. Finally, an aircraft bracket assembly task is demonstrated in a laboratory environment, and the comparative study results show that the proposed approach outperforms other state-of-the-art ones for efficient action prediction.
AB - Human-robot collaborative assembly (HRCA) is vital for achieving high-level flexible automation for mass personalization in today's smart factories. However, existing works in both industry and academia mainly focus on adaptive robot planning and seldom consider human operators' intentions in advance, which hinders the transition of HRCA toward a proactive manner. To overcome this bottleneck, this article proposes a multimodal transfer-learning-enabled action prediction approach, serving as the prerequisite for proactive HRCA. First, a multimodal intelligence-based action recognition approach is proposed to predict ongoing human actions by leveraging the visual stream and skeleton stream with short-time input frames. Second, a transfer-learning-enabled model is adapted to rapidly transfer learnt knowledge from daily activities to industrial assembly operations for online operator intention analysis. Third, a dynamic decision-making mechanism, including robotic decision and motion control, is described to allow mobile robots to assist operators in a proactive manner. Finally, an aircraft bracket assembly task is demonstrated in a laboratory environment, and the comparative study results show that the proposed approach outperforms other state-of-the-art ones for efficient action prediction.
KW - Action recognition
KW - human-robot collaboration
KW - multimodal intelligence
KW - transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85114652119&partnerID=8YFLogxK
U2 - 10.1109/TIE.2021.3105977
DO - 10.1109/TIE.2021.3105977
M3 - Journal article
AN - SCOPUS:85114652119
SN - 0278-0046
VL - 69
SP - 8579
EP - 8588
JO - IEEE Transactions on Industrial Electronics
JF - IEEE Transactions on Industrial Electronics
IS - 8
ER -