Gesture recognition has become a popular area of research with applications in medical systems, assistive technologies, entertainment, crisis management, disaster relief and human-machine interaction. This paper presents a static gesture recognition system which uses an Asus Xtion Pro Live sensor to obtain the skeletal model of the user. Typically, joint angles and joint positions have been used as features. However, these features do not adequately divide the gesture space, resulting in sub-optimal classification accuracy. Therefore, to improve the classification accuracy, a new feature vector is proposed, combining joint angles with the positions of the arm joints relative to the head. A k-means classifier is used to cluster each gesture, and new gestures are classified using a Euclidean distance metric. The new feature vector is evaluated on a dataset of 10 static gestures performed by 7 participants. The vector containing only joint angles achieves a classification accuracy of 91.98%. In contrast, the new feature vector containing both joint angles and the relative positions of the arm joints with respect to the head achieves a classification accuracy of over 99%.
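The classification pipeline described above can be illustrated with a minimal sketch: build a feature vector from joint angles plus arm-joint positions expressed relative to the head, represent each gesture class by a cluster centroid (the k = 1 case of k-means), and label a new sample by its nearest centroid under Euclidean distance. All function names, joint sets and coordinate conventions here are illustrative assumptions, not the paper's exact implementation.

```python
import math

def build_feature_vector(joint_angles, arm_joints, head):
    # Concatenate joint angles with arm-joint positions relative to the head.
    # The particular joints used are an assumption for illustration.
    relative = [c - h for joint in arm_joints for c, h in zip(joint, head)]
    return list(joint_angles) + relative

def centroid(vectors):
    # Mean of equal-length feature vectors: the degenerate k=1 case of k-means.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    # Assign the gesture label whose centroid is nearest in Euclidean distance.
    return min(centroids, key=lambda g: math.dist(sample, centroids[g]))
```

For example, given a few training vectors per gesture, `{g: centroid(vs) for g, vs in training.items()}` produces the per-gesture centroids, and `classify` labels an unseen feature vector. The full method would run k-means with k > 1 per gesture to capture within-gesture variation, but the nearest-centroid decision rule is the same.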
Reference:
Mangera, R. 2013. Static gesture recognition using features extracted from skeletal data. In: Proceedings of the Twenty-Fourth Annual Symposium of the Pattern Recognition Association of South Africa (PRASA 2013), Auckland Park, 3 December 2013. http://hdl.handle.net/10204/7222