
Dynamic Gesture Recognition Based On The Leap Motion Controller

Posted on: 2021-05-17
Degree: Master
Type: Thesis
Country: China
Candidate: J A Chen
Full Text: PDF
GTID: 2428330611497523
Subject: Mechanical engineering
Abstract/Summary:
As one of the most important modes of human-computer interaction, dynamic gesture recognition has been highlighted repeatedly in the National Key R&D Program and has attracted wide attention from many fields. In the past, most research and applications captured dynamic gestures with ordinary cameras and extracted hand features from the images, but those features were usually contaminated by noise and did not represent dynamic gestures well. With the advent of depth vision sensors such as the Leap Motion controller, reliable hand features can now be obtained for dynamic gesture recognition. However, the accuracies reported by researchers at home and abroad for recognizing American Sign Language and everyday gestures with the Leap Motion remain modest. This thesis therefore aims to improve the recognition accuracy of such dynamic gestures and proposes a recognition system based on the Leap Motion controller and bidirectional Long Short-Term Memory (BiLSTM) networks. The main research questions are which features should represent dynamic gestures, how to capture dynamic gestures effectively, and how to design an appropriate recognition model.

First, to represent the gestures, a group of gesture feature vectors is proposed. Each vector in the group consists of 26 feature values that fall into six categories: the distance between each fingertip and the palm center, the angle between each finger and the palm plane, the distance between each fingertip and the palm plane, the distance between adjacent fingertips, the angle between adjacent fingers and the palm, and the palm coordinates. The number of vectors in a group varies with the execution time of the dynamic gesture (see the per-frame feature sketch after the abstract).

Next, to capture dynamic gestures effectively, a dynamic gesture acquisition algorithm is designed. The start and end of a gesture are determined from the three-dimensional rotation angle of the palm plane between the current frame and the previous frame, together with the fingertip speed in the current frame, both measured in the Leap Motion coordinate system. When the palm rotation angle or the fingertip speed exceeds its threshold, the gesture is judged to have started and the sensor begins to transmit feature values; when both quantities are less than or equal to their thresholds, the gesture is judged to have ended and transmission stops (see the segmentation sketch after the abstract). With this acquisition algorithm and the gesture feature values, three dynamic gesture datasets are built: an American Sign Language dataset with 360 samples, an American Sign Language dataset with 480 samples, and a Handicraft-Gesture dataset with 300 samples.

Third, to design an appropriate recognition model, a novel dynamic gesture recognition model is presented: a two-layer bidirectional Long Short-Term Memory network. The model consists of two BiLSTM layers stacked vertically and uses Gaussian-initialized network parameters, a cyclical learning rate, random dropout, and the Adam optimizer (a Keras-style sketch is given below).
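A minimal NumPy sketch of how one such 26-value feature vector could be assembled from a single frame is given below. The array layout, the angle conventions, the reading of the "adjacent fingers and the palm" angle, and the helper name frame_features are assumptions made for illustration; the abstract does not give explicit formulas.

```python
import numpy as np

def frame_features(fingertips, palm_center, palm_normal):
    """Build one 26-value feature vector from a single Leap Motion frame.

    Assumed layout (not specified in the abstract):
      fingertips  -- (5, 3) array of fingertip positions, thumb to pinky
      palm_center -- (3,) palm center position
      palm_normal -- (3,) unit normal of the palm plane
    """
    feats = []

    # 1) distance between each fingertip and the palm center (5 values)
    feats += [np.linalg.norm(tip - palm_center) for tip in fingertips]

    # 2) angle between each finger and the palm plane (5 values);
    #    angle to the plane = 90 degrees minus the angle to the plane normal
    dirs = [(tip - palm_center) / np.linalg.norm(tip - palm_center) for tip in fingertips]
    feats += [np.pi / 2 - np.arccos(np.clip(np.dot(d, palm_normal), -1.0, 1.0)) for d in dirs]

    # 3) distance between each fingertip and the palm plane (5 values)
    feats += [abs(np.dot(tip - palm_center, palm_normal)) for tip in fingertips]

    # 4) distance between adjacent fingertips (4 values)
    feats += [np.linalg.norm(fingertips[i + 1] - fingertips[i]) for i in range(4)]

    # 5) angle between adjacent fingers, taken between their direction vectors
    #    at the palm center (4 values) -- one possible reading of the
    #    "angle between adjacent fingers and the palm" category
    feats += [np.arccos(np.clip(np.dot(dirs[i], dirs[i + 1]), -1.0, 1.0)) for i in range(4)]

    # 6) palm coordinates (3 values)
    feats += list(palm_center)

    return np.asarray(feats)  # shape (26,)
```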
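The start/end detection described in the acquisition paragraph can be sketched as a small state machine over the frame stream. The threshold values, the per-frame tuple layout, and the function name segment_gestures are assumptions; the abstract does not report the actual thresholds.

```python
import numpy as np

# Illustrative thresholds; the abstract does not report the values used in the thesis.
ANGLE_THRESHOLD = 0.15   # rad, palm-plane rotation between consecutive frames
SPEED_THRESHOLD = 60.0   # mm/s, fingertip speed in the current frame

def segment_gestures(frames):
    """Yield one (T, 26) array of feature vectors per detected dynamic gesture.

    `frames` is assumed to be an iterable of
    (palm_rotation_angle, fingertip_speed, feature_vector) tuples
    derived from the Leap Motion stream.
    """
    recording = False
    current = []
    for angle, speed, feature_vec in frames:
        if angle > ANGLE_THRESHOLD or speed > SPEED_THRESHOLD:
            # gesture has started (or is still running): keep streaming features
            recording = True
            current.append(feature_vec)
        elif recording:
            # both quantities are at or below their thresholds: gesture has ended
            recording = False
            yield np.asarray(current)
            current = []
    if recording and current:  # stream ended while a gesture was still running
        yield np.asarray(current)
```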
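A possible Keras sketch of the two-layer bidirectional LSTM classifier is shown below. The layer widths, dropout rate, class count, and the triangular form of the cyclical learning rate are assumptions; only the stacked BiLSTM structure, Gaussian initialization, random dropout, cyclical learning rate, and Adam training are stated in the abstract.

```python
import tensorflow as tf

NUM_FEATURES = 26   # per-frame gesture feature values (from the abstract)
NUM_CLASSES = 10    # illustrative; depends on which gesture dataset is used

# Two vertically stacked bidirectional LSTM layers with Gaussian-initialized
# weights and random dropout, followed by a softmax classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, NUM_FEATURES)),  # variable-length sequences
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64, return_sequences=True,
                             kernel_initializer="random_normal")),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64, kernel_initializer="random_normal")),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

BASE_LR, MAX_LR, STEP = 1e-4, 1e-3, 10  # illustrative cyclical-learning-rate settings

def triangular_clr(epoch, lr=None):
    """Simple triangular cyclical learning rate (one possible schedule)."""
    cycle_pos = abs((epoch % (2 * STEP)) - STEP) / STEP  # 1 -> 0 -> 1 over a cycle
    return MAX_LR - (MAX_LR - BASE_LR) * cycle_pos

model.compile(optimizer=tf.keras.optimizers.Adam(),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training would pad the variable-length sequences to a common length, e.g.:
# model.fit(x_train, y_train, epochs=60, batch_size=16,
#           callbacks=[tf.keras.callbacks.LearningRateScheduler(triangular_clr)],
#           validation_data=(x_test, y_test))
```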
Finally, the three dynamic gesture datasets are split into training and testing sets at ratios of 7:3, 6:4, and 7:3, respectively, and the recognition system is tested, verified, and compared. On the American Sign Language dataset with 360 samples, the system achieves accuracies of 100% on the training set and 96.3% on the testing set; on the American Sign Language dataset with 480 samples, 100% and 95.2%; and on the Handicraft-Gesture dataset, 100% and 96.7%. In addition, 5-fold, 10-fold, and leave-one-out cross-validation are performed on these datasets. The 5-fold accuracies are 93.33%, 93.75%, and 88.66% on the American Sign Language dataset with 360 samples, the American Sign Language dataset with 480 samples, and the Handicraft-Gesture dataset, respectively; the 10-fold accuracies on the same three datasets are 94.1%, 93.5%, and 90%, and the leave-one-out accuracies are 94.7%, 96.4%, and 90.3%. Moreover, under the same conditions, the experimental results are compared with domestic and foreign studies, and the comparison shows that the recognition accuracies of the proposed system are higher.
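The k-fold and leave-one-out evaluations could be run along the following lines with scikit-learn. The epoch count, the helper names, and the assumption that the sequences have been padded into a single array are illustrative; the splits themselves (5-fold, 10-fold, leave-one-out) are those reported above.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, LeaveOneOut

def cross_validate(build_model, x, y, n_splits=5):
    """Average test accuracy over a stratified k-fold split.

    `build_model` is assumed to return a freshly compiled classifier (e.g. the
    two-layer BiLSTM sketched above); `x` holds the padded feature sequences
    and `y` the integer gesture labels.
    """
    scores = []
    for train_idx, test_idx in StratifiedKFold(n_splits=n_splits, shuffle=True).split(x, y):
        model = build_model()  # re-initialize the network for every fold
        model.fit(x[train_idx], y[train_idx], epochs=60, verbose=0)
        _, acc = model.evaluate(x[test_idx], y[test_idx], verbose=0)
        scores.append(acc)
    return float(np.mean(scores))

# 5-fold and 10-fold runs as reported above:
# cross_validate(build_model, x, y, n_splits=5)
# cross_validate(build_model, x, y, n_splits=10)
# Leave-one-out uses LeaveOneOut().split(x), i.e. one test sample per fold.
```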
Keywords/Search Tags: Dynamic gesture recognition, Leap Motion, Gesture feature, Long Short-Term Memory network, Cross-validation