An incremental cross-modal transfer learning method for gesture interaction

Junpei Zhong (Corresponding Author), Jie Li, Ahmad Lotfi, Peidong Liang, Chenguang Yang

Research output: Journal article publication › Journal article › Academic research › peer-review

Abstract

Gestures are an important modality for human–robot interaction, since they can give accurate and intuitive instructions to robots. Various sensors can be used to capture gestures; we apply three different sensors that provide different modalities for recognizing human gestures. These data also have statistical properties relevant to transfer learning: the data-sets share the same labels, but the source and validation data-sets follow their own statistical distributions. To tackle the transfer learning problem across different sensors with such data-sets, we propose a weighting method that adjusts the probability distributions of the data, which results in faster convergence. We further apply this method in a broad learning system, which has proven efficient to train and supports incremental learning. The results show that, although the three sensors measure different parts of the body using different technologies, transfer learning is able to find the weighting correlation among the data-sets. The results also suggest that the proposed transfer learning method can adjust data with different distributions, which may reflect the physical correlation between different parts of the body when giving gestures.
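
The abstract mentions a weighting method that adjusts the probability distributions of the data before transfer across sensor modalities. As an illustrative sketch only (the paper's actual formulation and broad learning system are not given here), the following Python snippet shows one standard way to derive such sample weights: estimating the density ratio between a source sensor's data and a target sensor's data with a domain classifier. All variable names such as X_emg and X_leap are hypothetical placeholders, not identifiers from the paper.

  # Illustrative sketch only: not the authors' method. It re-weights source
  # samples so their distribution better matches a target modality, using a
  # logistic-regression density-ratio estimate (scikit-learn).
  import numpy as np
  from sklearn.linear_model import LogisticRegression, SGDClassifier

  def estimate_importance_weights(X_source, X_target):
      """Estimate w(x) ~ p_target(x) / p_source(x) for each source sample."""
      X = np.vstack([X_source, X_target])
      # Domain labels: 0 = source sensor, 1 = target sensor.
      d = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
      clf = LogisticRegression(max_iter=1000).fit(X, d)
      p_target = clf.predict_proba(X_source)[:, 1]
      # Density ratio from the domain classifier's posterior odds.
      weights = p_target / np.clip(1.0 - p_target, 1e-6, None)
      return weights / weights.mean()  # normalise to unit mean

  # Hypothetical usage: weight source-sensor samples before fitting a gesture
  # classifier, so training emphasises samples resembling the target sensor.
  rng = np.random.default_rng(0)
  X_emg, y_emg = rng.normal(0, 1, (200, 8)), rng.integers(0, 4, 200)  # labelled source
  X_leap = rng.normal(0.5, 1.2, (150, 8))                             # unlabelled target
  w = estimate_importance_weights(X_emg, X_leap)
  gesture_clf = SGDClassifier(loss="log_loss").fit(X_emg, y_emg, sample_weight=w)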

Original language: English
Article number: 104181
Pages (from-to): 104181
Number of pages: 1
Journal: Robotics and Autonomous Systems
Volume: 155
Early online date: 24 Jun 2022
DOIs
Publication status: E-pub ahead of print - 24 Jun 2022

Keywords

  • Depth camera
  • EMG
  • Gesture recognition
  • Leap Motion
  • Multi-modal
  • Transfer learning

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Mathematics (all)
  • Computer Science Applications
