ResearchSpace

Static gesture recognition using features extracted from skeletal data

dc.contributor.author Mangera, R
dc.date.accessioned 2014-02-19T07:40:58Z
dc.date.available 2014-02-19T07:40:58Z
dc.date.issued 2013-12
dc.identifier.citation Mangera, R. 2013. Static gesture recognition using features extracted from skeletal data. In: Proceedings of the Twenty-Fourth Annual Symposium of the Pattern Recognition Association of South Africa, Auckland Park, 3 December 2013 en_US
dc.identifier.uri http://www.prasa.org/proceedings/2013/prasa2013-09.pdf
dc.identifier.uri http://hdl.handle.net/10204/7222
dc.description Proceedings of the Twenty-Fourth Annual Symposium of the Pattern Recognition Association of South Africa, Auckland Park, 3 December 2013. Post-print uploaded. en_US
dc.description.abstract Gesture recognition has become a popular area of research, with applications in medical systems, assistive technologies, entertainment, crisis management, disaster relief and human-machine interaction. This paper presents a static gesture recognition system which uses an Asus Xtion Pro Live sensor to obtain the skeletal model of the user. Typically, joint angles and joint positions have been used as features. However, these features do not adequately divide the gesture space, resulting in suboptimal classification accuracy. Therefore, to improve the classification accuracy, a new feature vector, combining joint angles and the relative positions of the arm joints with respect to the head, is proposed. A k-means classifier is used to cluster each gesture. New gestures are classified using a Euclidean distance metric. The new feature vector is evaluated on a dataset of 10 static gestures performed by 7 participants. The vector containing only joint angles achieves a classification accuracy of 91.98%. In contrast, the new feature vector containing both joint angles and the relative positions of the arm joints with respect to the head achieves a classification accuracy of over 99%. en_US
dc.language.iso en en_US
dc.publisher PRASA 2013 Proceedings en_US
dc.relation.ispartofseries Workflow;11891
dc.subject Gesture recognition en_US
dc.subject Skeletal data en_US
dc.subject Asus Xtion Pro en_US
dc.subject Depth sensors en_US
dc.title Static gesture recognition using features extracted from skeletal data en_US
dc.type Conference Presentation en_US
dc.identifier.apacitation Mangera, R. (2013). Static gesture recognition using features extracted from skeletal data. PRASA 2013 Proceedings. http://hdl.handle.net/10204/7222 en_ZA
dc.identifier.chicagocitation Mangera, R. "Static gesture recognition using features extracted from skeletal data." (2013): http://hdl.handle.net/10204/7222 en_ZA
dc.identifier.vancouvercitation Mangera R, Static gesture recognition using features extracted from skeletal data; PRASA 2013 Proceedings; 2013. http://hdl.handle.net/10204/7222. en_ZA
dc.identifier.ris TY - Conference Presentation AU - Mangera, R AB - Gesture recognition has become a popular area of research, with applications in medical systems, assistive technologies, entertainment, crisis management, disaster relief and human-machine interaction. This paper presents a static gesture recognition system which uses an Asus Xtion Pro Live sensor to obtain the skeletal model of the user. Typically, joint angles and joint positions have been used as features. However, these features do not adequately divide the gesture space, resulting in suboptimal classification accuracy. Therefore, to improve the classification accuracy, a new feature vector, combining joint angles and the relative positions of the arm joints with respect to the head, is proposed. A k-means classifier is used to cluster each gesture. New gestures are classified using a Euclidean distance metric. The new feature vector is evaluated on a dataset of 10 static gestures performed by 7 participants. The vector containing only joint angles achieves a classification accuracy of 91.98%. In contrast, the new feature vector containing both joint angles and the relative positions of the arm joints with respect to the head achieves a classification accuracy of over 99%. DA - 2013-12 DB - ResearchSpace DP - CSIR KW - Gesture recognition KW - Skeletal data KW - Asus Xtion Pro KW - Depth sensors LK - https://researchspace.csir.co.za PY - 2013 T1 - Static gesture recognition using features extracted from skeletal data TI - Static gesture recognition using features extracted from skeletal data UR - http://hdl.handle.net/10204/7222 ER - en_ZA
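
The pipeline described in the abstract (joint angles combined with head-relative arm-joint positions, per-gesture k-means clustering, and nearest-centroid classification under a Euclidean metric) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the joint set, the choice of elbow angles, the number of clusters per gesture and the use of scikit-learn's KMeans are all assumptions made here for illustration.

# Minimal sketch of the method summarised in the abstract. Skeleton layout,
# joint names and the angle triplets are assumptions; the paper's exact
# feature definitions may differ.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical joint order for one skeleton frame: (x, y, z) per joint.
JOINTS = ["head", "l_shoulder", "l_elbow", "l_hand",
          "r_shoulder", "r_elbow", "r_hand"]
IDX = {name: i for i, name in enumerate(JOINTS)}

def joint_angle(a, b, c):
    """Angle at joint b formed by segments b->a and b->c, in radians."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def feature_vector(skel):
    """skel: (n_joints, 3) array. Combines elbow angles with the positions
    of the arm joints expressed relative to the head, as proposed in the
    abstract."""
    head = skel[IDX["head"]]
    angles = [
        joint_angle(skel[IDX["l_shoulder"]], skel[IDX["l_elbow"]], skel[IDX["l_hand"]]),
        joint_angle(skel[IDX["r_shoulder"]], skel[IDX["r_elbow"]], skel[IDX["r_hand"]]),
    ]
    arm = ["l_shoulder", "l_elbow", "l_hand", "r_shoulder", "r_elbow", "r_hand"]
    rel = np.concatenate([skel[IDX[j]] - head for j in arm])
    return np.concatenate([angles, rel])

def train(features_by_gesture, k=3):
    """Fit one k-means model per gesture (k is an assumed value here);
    returns {gesture_label: (k, d) array of cluster centroids}."""
    return {label: KMeans(n_clusters=k, n_init=10).fit(np.asarray(X)).cluster_centers_
            for label, X in features_by_gesture.items()}

def classify(centroids_by_gesture, feat):
    """Assign the gesture whose nearest centroid (Euclidean) is closest."""
    return min(centroids_by_gesture,
               key=lambda g: min(np.linalg.norm(feat - c) for c in centroids_by_gesture[g]))

Usage would follow the same shape as the evaluation in the abstract: build feature vectors for the training frames of each of the 10 gestures, call train() once, then label each new frame with classify(centroids, feature_vector(frame)).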

