ResearchSpace

Cascading neural networks for upper-body gesture recognition


dc.contributor.author Mangera, R
dc.contributor.author Senekal, F
dc.contributor.author Nicolls, F
dc.date.accessioned 2014-12-03T06:56:23Z
dc.date.available 2014-12-03T06:56:23Z
dc.date.issued 2014
dc.identifier.citation Mangera, R, Senekal, F and Nicolls, F. 2014. Cascading neural networks for upper-body gesture recognition. In: Proceedings of the International Conference on Machine Vision and Machine Learning, Prague, Czech Republic, 14-15 August 2014 en_US
dc.identifier.uri http://hdl.handle.net/10204/7801
dc.description Proceedings of the International Conference on Machine Vision and Machine Learning, Prague, Czech Republic, 14-15 August 2014 en_US
dc.description.abstract Gesture recognition has many applications ranging from health care to entertainment. However, for it to be a feasible method of human-computer interaction, it is essential that only intentional movements are interpreted and that the system works for a wide variety of users. To date, very few systems have been tested in real-world conditions, where users are inexperienced in gesture performance and therefore produce data that is noisier in terms of gesture starts, gesture motion and gesture ends. In addition, few systems have taken into consideration the dominant hand used when performing gestures. The work presented in this paper addresses this by first selecting key-frames from a gesture sequence and then cascading neural networks for left- and right-hand gesture classification. The first neural network determines which hand is being used to perform the gesture, and the second neural network then recognises the gesture. The performance of the system is tested on the VisApp2013 gesture dataset, which consists of four left- and right-hand gestures. This dataset is unique in that the test gesture samples were performed by untrained users to simulate a real-world environment. With key-frame selection and cascaded neural networks, the system accuracy improves from 79.8% to 95.6%. en_US
dc.language.iso en en_US
dc.publisher Avestia Publishing en_US
dc.relation.ispartofseries Workflow;13821
dc.subject Gesture recognition en_US
dc.subject Depth sensor en_US
dc.subject Neural networks en_US
dc.title Cascading neural networks for upper-body gesture recognition en_US
dc.type Book Chapter en_US
dc.identifier.apacitation Mangera, R., Senekal, F., & Nicolls, F. (2014). Cascading neural networks for upper-Body gesture recognition., <i>Workflow;13821</i> Avestia Publishing. http://hdl.handle.net/10204/7801 en_ZA
dc.identifier.chicagocitation Mangera, R, F Senekal, and F Nicolls. "Cascading neural networks for upper-body gesture recognition" In <i>WORKFLOW;13821</i>, n.p.: Avestia Publishing. 2014. http://hdl.handle.net/10204/7801. en_ZA
dc.identifier.vancouvercitation Mangera R, Senekal F, Nicolls F. Cascading neural networks for upper-body gesture recognition. Workflow;13821. [place unknown]: Avestia Publishing; 2014. [cited yyyy month dd]. http://hdl.handle.net/10204/7801. en_ZA
dc.identifier.ris TY - Book Chapter AU - Mangera, R AU - Senekal, F AU - Nicolls, F AB - Gesture recognition has many applications ranging from health care to entertainment. However for it to be a feasible method of human-computer interaction it is essential that only intentional movements are interpreted and that the system can work for a wide variety of users. To date very few systems have been tested for the real-world where users are inexperienced in gesture performance resulting in data which is noisier in terms of gesture-starts, gesture motion and gesture-ends. In addition, few systems have taken into consideration the dominant hand used when performing gestures. The work presented in this paper takes this into consideration by firstly selecting key-frames from a gesture sequence then cascading neural networks for left and right gesture classification. The first neural network determines which hand is being used for gesture performance and the second neural network then recognises the gesture. The performance of the system is tested using the VisApp2013 gesture dataset which consists of four left and right hand gestures. This dataset is unique in that the test gesture samples have been performed by untrained users to simulate a real-world environment. By key-frame selection and cascading neural networks the system accuracy improves from 79.8% to 95.6%. DA - 2014 DB - ResearchSpace DP - CSIR KW - Gesture recognition KW - Depth sensor KW - Neural networks LK - https://researchspace.csir.co.za PY - 2014 T1 - Cascading neural networks for upper-body gesture recognition TI - Cascading neural networks for upper-body gesture recognition UR - http://hdl.handle.net/10204/7801 ER - en_ZA
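The cascade described in the abstract can be illustrated with a short sketch: a first network decides which hand is performing the gesture, and a hand-specific second network then recognises the gesture itself. The code below is a minimal illustration, not the authors' implementation; the feature dimension, the TinyMLP stand-in classifier and the routing function are all hypothetical, with only the two-stage cascade and the four-gestures-per-hand setup taken from the abstract above.

# Minimal sketch of the cascading idea from the abstract (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class TinyMLP:
    """One-hidden-layer network used purely as a stand-in classifier."""
    def __init__(self, n_in, n_hidden, n_out):
        self.w1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(scale=0.1, size=(n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def predict_proba(self, x):
        h = np.tanh(x @ self.w1 + self.b1)
        return softmax(h @ self.w2 + self.b2)

# Stage 1: which hand performs the gesture (0 = left, 1 = right).
hand_net = TinyMLP(n_in=30, n_hidden=16, n_out=2)

# Stage 2: one gesture classifier per hand (four gestures each, as in the
# VisApp2013 dataset described in the abstract).
gesture_nets = {0: TinyMLP(30, 16, 4), 1: TinyMLP(30, 16, 4)}

def classify_gesture(keyframe_features):
    """Cascade: route the key-frame feature vector through the hand
    classifier, then through the matching gesture classifier."""
    hand = int(np.argmax(hand_net.predict_proba(keyframe_features)))
    gesture = int(np.argmax(gesture_nets[hand].predict_proba(keyframe_features)))
    return hand, gesture

# Example call on a dummy feature vector built from selected key-frames.
features = rng.normal(size=30)
print(classify_gesture(features))

In this sketch the feature vector is assumed to summarise the selected key-frames; the abstract does not specify the feature representation or the key-frame selection method, so those details are left abstract here.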

