TY - JOUR
T1 - Skill Learning for Intelligent Robot by Perception-Action Integration
T2 - A View from Hierarchical Temporal Memory
AU - Zhang, Xinzheng
AU - Zhang, Jianfen
AU - Zhong, Junpei
N1 - Funding Information:
This work was supported by the National Natural Science Foundation of China under Grant no. 61203338. The authors thank the NuPIC Open Source Project and all contributors to the NuPIC code.
Publisher Copyright:
© 2017 Xinzheng Zhang et al.
PY - 2017/11/15
Y1 - 2017/11/15
N2 - Learning skills autonomously through interaction with the environment is a crucial ability for intelligent robots. Perception-action integration, or the sensorimotor cycle, is an important issue in imitation learning and provides a natural mechanism that avoids complex programming. Recently, neurocomputing models and developmental intelligence methods have been regarded as a new trend for implementing robot skill learning. In this paper, drawing on research into models of the human neocortex, we present a skill learning method based on a perception-action integration strategy from the perspective of hierarchical temporal memory (HTM) theory. Sequential sensor data representing a given skill are received from an RGB-D camera and encoded as a sequence of Sparse Distributed Representation (SDR) vectors. These sequential SDR vectors serve as the inputs of the perception-action HTM, which learns the SDR sequences and predicts the next input SDR, storing the transitions between the currently perceived sensor data and the next predicted actions. We evaluated the proposed framework by teaching a humanoid NAO robot a hand-shaking skill. The experimental results show that the proposed skill learning method is promising.
AB - Learning skills autonomously through interaction with the environment is a crucial ability for intelligent robots. Perception-action integration, or the sensorimotor cycle, is an important issue in imitation learning and provides a natural mechanism that avoids complex programming. Recently, neurocomputing models and developmental intelligence methods have been regarded as a new trend for implementing robot skill learning. In this paper, drawing on research into models of the human neocortex, we present a skill learning method based on a perception-action integration strategy from the perspective of hierarchical temporal memory (HTM) theory. Sequential sensor data representing a given skill are received from an RGB-D camera and encoded as a sequence of Sparse Distributed Representation (SDR) vectors. These sequential SDR vectors serve as the inputs of the perception-action HTM, which learns the SDR sequences and predicts the next input SDR, storing the transitions between the currently perceived sensor data and the next predicted actions. We evaluated the proposed framework by teaching a humanoid NAO robot a hand-shaking skill. The experimental results show that the proposed skill learning method is promising.
UR - http://www.scopus.com/inward/record.url?scp=85042223042&partnerID=8YFLogxK
U2 - 10.1155/2017/7948684
DO - 10.1155/2017/7948684
M3 - Journal article
AN - SCOPUS:85042223042
SN - 1076-2787
VL - 2017
JO - Complexity
JF - Complexity
M1 - 7948684
ER -