Please use this identifier to cite or link to this item: http://localhost:8081/jspui/handle/123456789/16552
Title: STATIC HAND GESTURE RECOGNITION USING DEEP LEARNING
Authors: Jain, Naman
Keywords: Gesture Recognition;Recognize Meaningful;Static Human Gesture;Neural Networks
Issue Date: May-2017
Publisher: IIT ROORKEE
Abstract: Gesture recognition, in technical terms, is a technique for recognizing meaningful expressions of motion by a human, involving the hands, arms, face, head, and/or body. Hand gestures have proven effective for human communication, and active research is ongoing to replicate that success in this field as in other computer vision systems. Human–computer interaction can be significantly improved by advances in systems capable of recognizing different hand gestures. Gesture recognition is broadly divided into two categories, namely static recognition and dynamic recognition. This report presents, discusses, and sheds light on the techniques used in static human gesture recognition. One of the bottlenecks in recognizing static gestures occurs when images are scaled, rotated, and translated. In this report, particular focus is placed on removing this limitation and making the features invariant to the aforementioned factors. Some of the existing techniques that attempt to solve this problem are discussed, along with their methods and limitations. The report then presents a more sophisticated and robust method that uses Convolutional Neural Networks (CNNs). In the work presented, experiments were conducted with different CNN models by varying certain parameters. The performance of each model was evaluated on Thomas Moeslund's static hand gesture database, and accuracy scores were calculated to assess the validity of these models.
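Illustration (not part of the thesis record): the abstract describes comparing CNN models and reporting accuracy on a static hand gesture dataset. The following minimal Keras sketch only indicates the kind of convolutional classifier such experiments typically vary; the input size (64x64 grayscale), layer widths, and class count are assumptions, not the architecture used in the thesis.

    # Minimal illustrative CNN for static hand gesture classification.
    # Assumptions (not from the thesis): 64x64 grayscale inputs, 24 gesture classes.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    def build_gesture_cnn(input_shape=(64, 64, 1), num_classes=24):
        model = models.Sequential([
            layers.Input(shape=input_shape),
            layers.Conv2D(32, (3, 3), activation="relu"),   # first convolutional block
            layers.MaxPooling2D((2, 2)),
            layers.Conv2D(64, (3, 3), activation="relu"),   # second convolutional block
            layers.MaxPooling2D((2, 2)),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dropout(0.5),                            # regularization
            layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])                 # accuracy is the metric reported in the abstract
        return model

    model = build_gesture_cnn()
    model.summary()

Varying parameters such as the number of filters, kernel sizes, or dense-layer width in a sketch like this is one way to compare model variants by validation accuracy, which is the style of evaluation the abstract describes.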
URI: http://localhost:8081/jspui/handle/123456789/16552
Type: Other
Appears in Collections:MASTERS' THESES (E & C)

Files in This Item:
File: G27568.pdf
Size: 1.66 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.