TY - JOUR
T1 - A Combination Model of Shifting Joint Angle Changes With 3D-Deep Convolutional Neural Network to Recognize Human Activity
AU - Rahayu, Endang Sri
AU - Yuniarno, Eko Mulyanto
AU - Purnama, I. Ketut Eddy
AU - Purnomo, Mauridhi Hery
N1 - Publisher Copyright:
© 2001-2011 IEEE.
PY - 2024
Y1 - 2024
N2 - Human activity recognition is an active research area owing to its potential in applications such as medical rehabilitation, where efficient detection of and response to a wide range of movements is increasingly necessary. Current recognition methods classify activity patterns from changes in joint distance alone, so a different approach is required to identify the direction of movement and distinguish activities that exhibit similar joint-distance changes but opposite motion directions, such as sitting and standing. This study determined the direction of movement using a novel joint angle shift approach: by analyzing the joint angle shift between specific joints and reference points across the sequence of activity frames, variations in activity direction can be detected. The joint angle shift method was combined with a Deep Convolutional Neural Network (DCNN) model to classify 3D datasets encompassing spatial-temporal information from RGB-D video data. Model performance was evaluated using a confusion matrix. The results show that the model successfully classified nine activities in the Florence 3D Actions dataset, including sitting and standing, with an accuracy of (96.72 ± 0.83)%. To evaluate its robustness, the model was also tested on the UTKinect Action3D dataset, obtaining an accuracy of 97.44% and demonstrating state-of-the-art performance.
AB - Human activity recognition is an active research area owing to its potential in applications such as medical rehabilitation, where efficient detection of and response to a wide range of movements is increasingly necessary. Current recognition methods classify activity patterns from changes in joint distance alone, so a different approach is required to identify the direction of movement and distinguish activities that exhibit similar joint-distance changes but opposite motion directions, such as sitting and standing. This study determined the direction of movement using a novel joint angle shift approach: by analyzing the joint angle shift between specific joints and reference points across the sequence of activity frames, variations in activity direction can be detected. The joint angle shift method was combined with a Deep Convolutional Neural Network (DCNN) model to classify 3D datasets encompassing spatial-temporal information from RGB-D video data. Model performance was evaluated using a confusion matrix. The results show that the model successfully classified nine activities in the Florence 3D Actions dataset, including sitting and standing, with an accuracy of (96.72 ± 0.83)%. To evaluate its robustness, the model was also tested on the UTKinect Action3D dataset, obtaining an accuracy of 97.44% and demonstrating state-of-the-art performance.
KW - Combination model
KW - deep convolutional neural network
KW - human activity recognition
KW - shifting joint angles
UR - http://www.scopus.com/inward/record.url?scp=85186987183&partnerID=8YFLogxK
U2 - 10.1109/TNSRE.2024.3371474
DO - 10.1109/TNSRE.2024.3371474
M3 - Article
C2 - 38421841
AN - SCOPUS:85186987183
SN - 1534-4320
VL - 32
SP - 1078
EP - 1089
JO - IEEE Transactions on Neural Systems and Rehabilitation Engineering
JF - IEEE Transactions on Neural Systems and Rehabilitation Engineering
ER -