Abstract
Gesture is an important modality for human–robot interaction, since it can convey accurate and intuitive instructions to robots. Various sensors can be used to capture gestures; we apply three sensors that provide different modalities for recognizing human gestures. The resulting data have statistical properties well suited to transfer learning: the data-sets share the same labels, but the source and validation data-sets each follow their own statistical distribution. To tackle the transfer learning problem across sensors with such data-sets, we propose a weighting method that adjusts the probability distributions of the data, yielding faster convergence. We further apply this method in a broad learning system, which is efficient to train and supports incremental learning. The results show that, although the three sensors measure different parts of the body using different technologies, transfer learning can discover the weighting correlation among the data-sets. They also suggest that the proposed transfer learning can adjust data with different distributions in a way that may reflect the physical correlation between different parts of the body when gestures are performed.
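The abstract does not spell out the weighting scheme, so the following is only a minimal sketch of the general idea of reweighting source samples so that their distribution matches the target sensor's data. It uses a standard importance-weighting stand-in (a source-vs-target domain classifier estimating the density ratio), not the authors' specific method or their broad learning system; the feature arrays, label counts, and sensor names in the usage example are hypothetical.

```python
# Sketch only: distribution adjustment via importance weighting, assuming
# feature matrices from two sensors that share the same gesture labels.
import numpy as np
from sklearn.linear_model import LogisticRegression

def importance_weights(X_source, X_target):
    """Weight each source sample by an estimate of the density ratio
    p_target(x) / p_source(x), obtained from a source-vs-target classifier."""
    X = np.vstack([X_source, X_target])
    d = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
    domain_clf = LogisticRegression(max_iter=1000).fit(X, d)
    p_target = domain_clf.predict_proba(X_source)[:, 1]
    # p(target | x) / p(source | x) is proportional to the density ratio.
    return p_target / np.clip(1.0 - p_target, 1e-6, None)

# Hypothetical usage: features from two sensors with shared gesture labels.
rng = np.random.default_rng(0)
X_src = rng.normal(0.0, 1.0, size=(200, 8))   # e.g. EMG features (source)
y_src = rng.integers(0, 4, size=200)          # shared gesture labels
X_tgt = rng.normal(0.5, 1.2, size=(150, 8))   # e.g. Leap Motion features (target)

w = importance_weights(X_src, X_tgt)
# Train any classifier on the reweighted source data; logistic regression
# here is just a placeholder for the paper's broad learning system.
gesture_clf = LogisticRegression(max_iter=1000).fit(X_src, y_src, sample_weight=w)
```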
| Original language | English |
|---|---|
| Article number | 104181 |
| Pages (from-to) | 104181 |
| Number of pages | 1 |
| Journal | Robotics and Autonomous Systems |
| Volume | 155 |
| Early online date | 24 Jun 2022 |
| Publication status | Published - Sept 2022 |
Keywords
- Depth camera
- EMG
- Gesture recognition
- Leap Motion
- Multi-modal
- Transfer learning
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Mathematics (all)
- Computer Science Applications