TY - JOUR
T1 - Efficient body motion quantification and similarity evaluation using 3-D joints skeleton coordinates
AU - Aouaidjia, Kamel
AU - Sheng, Bin
AU - Li, Ping
AU - Kim, Jinman
AU - Feng, David Dagan
N1 - Funding Information:
Manuscript received December 4, 2018; revised February 21, 2019; accepted May 4, 2019. Date of publication May 31, 2019; date of current version April 15, 2021. This work was supported in part by the National Natural Science Foundation of China under Grant 61872241 and Grant 61572316, in part by the National Key Research and Development Program of China under Grant 2017YFE0104000 and Grant 2016YFC1300302, in part by the Macau Science and Technology Development Fund under Grant 0027/2018/A1, and in part by the Science and Technology Commission of Shanghai Municipality under Grant 18410750700, Grant 17411952600, and Grant 16DZ0501100. This paper was recommended by Associate Editor R. Roberts. (Corresponding author: Bin Sheng.) A. Kamel is with the Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240, China.
Publisher Copyright:
© 2013 IEEE.
PY - 2021/5
Y1 - 2021/5
AB - Evaluating whole-body motion is challenging because of the articulated nature of the skeleton structure. Each joint moves in an unpredictable way, with countless possible movement directions, under the influence of one or more of its parent joints. This paper presents a method for human motion quantification based on three-dimensional (3-D) body joint coordinates. We calculate a set of metrics for the factors that influence each joint's movement, taking the motion of its parent joints into account, without requiring prior knowledge of the motion parameters. Only the raw joint coordinate data of a motion sequence are needed to automatically estimate the transformation matrix of the joints between frames. We also consider the angles between limbs as a fundamental factor in tracking the joint directions. We classify joint motion into global motion and local motion: the global motion represents the joint movement relative to a fixed joint, and the local motion represents the joint movement relative to its first parent joint. To evaluate the performance of the proposed method, we also propose a comparison algorithm between two skeleton motions based on the quantified metrics. We measured the comparative similarity between 3-D joint coordinates on Microsoft Kinect V2 data and the UTD-MHAD dataset. User studies were conducted to evaluate the performance under different factors. Various results and comparisons show that our method effectively quantifies and evaluates motion similarity.
KW - Human-computer interaction
KW - motion quantification
KW - similarity evaluation
KW - three-dimensional (3-D) human motion representation
UR - http://www.scopus.com/inward/record.url?scp=85104445187&partnerID=8YFLogxK
U2 - 10.1109/TSMC.2019.2916896
DO - 10.1109/TSMC.2019.2916896
M3 - Journal article
AN - SCOPUS:85104445187
SN - 2168-2216
VL - 51
SP - 2774
EP - 2788
JO - IEEE Transactions on Systems, Man, and Cybernetics: Systems
JF - IEEE Transactions on Systems, Man, and Cybernetics: Systems
IS - 5
ER -