TY - GEN
T1 - Robot teaching by teleoperation based on visual interaction and neural network learning
AU - Xu, Yang
AU - Yang, Chenguang
AU - Zhong, Junpei
AU - Ma, Hongbin
AU - Zhao, Lijun
AU - Wang, Min
N1 - Funding Information:
This work was partially supported by the National Natural Science Foundation of China (NSFC) under Grant 61473120, Guangdong Provincial Natural Science Foundation Grant 2014A030313266, International Science and Technology Collaboration Grant 2015A050502017, Science and Technology Planning Project of Guangzhou Grant 201607010006, State Key Laboratory of Robotics and System (HIT) Grant SKLRS-2017-KF-13, and the Fundamental Research Funds for the Central Universities.
Publisher Copyright:
© 2017 IEEE.
PY - 2018/3/21
Y1 - 2018/3/21
N2 - Traditional methods of robot teaching require human demonstrators to program with a teaching pendant, which is a complex and time-consuming exercise. In this paper, we propose a novel method based on teleoperation that allows a demonstrator to train a robot in an intuitive way. More specifically, the demonstrator first controls the robot through visual interaction, and then a learning algorithm based on a radial basis function (RBF) network is used to transfer the demonstrator's motions to the robot. To verify the effectiveness of the developed method, several simulation experiments were carried out based on the Microsoft Kinect sensor and the Virtual Robot Experimentation Platform (V-REP). The experimental results show that the method achieves satisfactory performance. With the help of this method, the robot can not only complete the task autonomously after teaching but can also learn the details of the demonstrator's behavior.
AB - Traditional methods of robot teaching require human demonstrators to program with a teaching pendant, which is a complex and time-consuming exercise. In this paper, we propose a novel method based on teleoperation that allows a demonstrator to train a robot in an intuitive way. More specifically, the demonstrator first controls the robot through visual interaction, and then a learning algorithm based on a radial basis function (RBF) network is used to transfer the demonstrator's motions to the robot. To verify the effectiveness of the developed method, several simulation experiments were carried out based on the Microsoft Kinect sensor and the Virtual Robot Experimentation Platform (V-REP). The experimental results show that the method achieves satisfactory performance. With the help of this method, the robot can not only complete the task autonomously after teaching but can also learn the details of the demonstrator's behavior.
KW - radial basis function (RBF) network
KW - Robot Teaching
KW - Teleoperation
UR - http://www.scopus.com/inward/record.url?scp=85050581765&partnerID=8YFLogxK
U2 - 10.1109/ICMIC.2017.8321615
DO - 10.1109/ICMIC.2017.8321615
M3 - Conference article published in proceeding or book
AN - SCOPUS:85050581765
T3 - Proceedings of 2017 9th International Conference On Modelling, Identification and Control, ICMIC 2017
SP - 1068
EP - 1073
BT - Proceedings of 2017 9th International Conference On Modelling, Identification and Control, ICMIC 2017
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 9th International Conference on Modelling, Identification and Control, ICMIC 2017
Y2 - 10 July 2017 through 12 July 2017
ER -