Robot teaching by teleoperation based on visual interaction and extreme learning machine

Yang Xu, Chenguang Yang, Junpei Zhong, Ning Wang, Lijun Zhao

Research output: Journal article publication › Journal article › Academic research › peer-review

43 Citations (Scopus)

Abstract

Compared with traditional robot teaching methods, teleoperation allows a robot to learn a variety of human-like skills in a more efficient and natural manner. In this paper, we propose a teleoperation method based on human-robot interaction (HRI) that mainly uses visual information. With only one teleoperated demonstration, the robot can reproduce a trajectory; however, this trajectory deviates from the optimal one because of errors introduced by the human demonstrator or the robot. We therefore use an extreme learning machine (ELM) based algorithm to transfer the demonstrator's motions to the robot. To verify the method, we use a Microsoft Kinect V2 to capture the human body motion and hand state, from which commands are generated to control a Baxter robot in the Virtual Robot Experimentation Platform (V-REP). After learning and training with the ELM, the robot in V-REP can complete a given task autonomously, and the physical robot can reproduce the trajectory well. The experimental results show that the developed method achieves satisfactory performance.
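The abstract does not spell out the ELM formulation used for motion transfer. As a rough illustration only (not the authors' implementation), the following Python/NumPy sketch shows the standard ELM training scheme the abstract refers to: input weights and biases are drawn randomly and kept fixed, and only the output weights are solved in closed form via the Moore-Penrose pseudoinverse. The `ELM` class, the trajectory encoding, and all dimensions are hypothetical.

```python
import numpy as np

class ELM:
    """Minimal single-hidden-layer extreme learning machine (illustrative sketch)."""

    def __init__(self, n_inputs, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        # Random input weights and biases, fixed after initialization (core ELM idea).
        self.W = rng.uniform(-1.0, 1.0, size=(n_inputs, n_hidden))
        self.b = rng.uniform(-1.0, 1.0, size=n_hidden)

    def _hidden(self, X):
        # Sigmoid activation of the fixed random hidden layer.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, Y):
        # Output weights beta = pinv(H) @ Y: the least-squares solution,
        # computed in one shot rather than by iterative gradient descent.
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ Y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Hypothetical usage: map demonstrated joint configurations (inputs) to
# corrected robot joint targets (outputs); 7 DoF and 100 hidden units are
# arbitrary choices for the sketch, not values from the paper.
if __name__ == "__main__":
    X = np.random.rand(200, 7)   # demonstrated joint angles per time step
    Y = np.random.rand(200, 7)   # corresponding target joint angles
    model = ELM(n_inputs=7, n_hidden=100).fit(X, Y)
    print(model.predict(X[:5]))
```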

Original language: English
Pages (from-to): 2093-2103
Number of pages: 11
Journal: Neurocomputing
Volume: 275
DOIs
Publication status: Published - 31 Jan 2018
Externally published: Yes

Keywords

  • Extreme learning machine (ELM)
  • Human-robot interaction (HRI)
  • Robot teaching
  • Teleoperation

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
