
Research On Dynamic Gesture Recognition Based On 3D Bone Data

Posted on: 2020-04-12
Degree: Master
Type: Thesis
Country: China
Candidate: L Y Pi
Full Text: PDF
GTID: 2428330575465551
Subject: Information and Communication Engineering

Abstract/Summary:
The traditional mode of human-computer interaction, in which the user operates a computer by entering commands through a mouse and keyboard, has a long history and remains the mainstream today. With advances in technology and the maturing of multimedia information technology, new modes of human-computer interaction have gradually reached the public, influencing and changing people's daily lives. Novel human-computer interaction technologies such as gesture recognition, speech recognition, motion recognition, and facial expression recognition are currently developing rapidly. They have been widely applied in fields such as the mobile internet, communications, gaming and entertainment, smart homes, and virtual reality, and they point the way for the future development of human-computer interaction. Because of the richness and diversity of the information they can convey, gestures are an effective means of natural human-computer interaction. This thesis studies dynamic gesture recognition based on two different somatosensory devices and designs gesture-interaction experiments to explore the feasibility of bare-hand interaction in a virtual environment. The main work of this thesis is as follows:

1) Considering that current open-source gesture databases are relatively small and contain a limited variety of gestures, a large dynamic gesture database was established. The database contains 27 dynamic gestures, recorded with two devices (Leap Motion and Intel RealSense) from a total of 88 participants.

2) For the hand-skeleton node data acquired by the Leap Motion, the traditional feature extraction method is improved: three different types of features (structural, sequential, and spatial) are extracted and combined for gesture recognition. A support vector machine (SVM) classifier was used to recognize 13 gestures from the established database; the results show that this method achieves a recognition rate of up to 97.5%. For the skeleton data obtained by the RealSense, a new feature extraction method is proposed: the hand shape is described by features computed over adjacent pairs of the 22 joint points, and the palm-node trajectory is extracted to compute the hand's direction of motion in each frame. To reduce the influence of timing variations in the sequences, a three-level Temporal Pyramid is used to partition each sequence into subsequences, and both features are extracted on each subsequence. The SVM classifier is again used to classify the 13 gestures, reaching a recognition rate of up to 95.0%.

3) A virtual gesture-interaction experiment was designed, based on an HTC VIVE and a Leap Motion in hardware and on Unity3D and MiddleVR for Unity in software. First, the coordinates of the hand skeleton nodes are obtained in real time from the Leap Motion. Then, using ray casting, the Update() method of a Unity3D script detects the object hit by a ray in each frame, and the relative distances of the relevant nodes are computed to determine the current gesture state, enabling interaction with the surrounding virtual models.
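The abstract does not give implementation details for the RealSense pipeline in item 2). The Python sketch below illustrates the general idea only: per-frame motion-direction features from a palm trajectory, a three-level temporal pyramid (split here into 1/2/4 segments per level, which is one common convention and an assumption, not the author's stated scheme), and a scikit-learn SVM trained on synthetic trajectories. All function names and the toy data are illustrative.

```python
import numpy as np
from sklearn.svm import SVC

def direction_features(palm_traj):
    """Per-frame motion direction: normalized displacement between
    consecutive palm positions. palm_traj has shape (T, 3)."""
    d = np.diff(palm_traj, axis=0)
    norms = np.linalg.norm(d, axis=1, keepdims=True)
    return d / np.maximum(norms, 1e-8)

def temporal_pyramid(seq, levels=3):
    """Three-level temporal pyramid: split the sequence into
    1, 2, and 4 segments, mean-pool each segment, and concatenate.
    Pooling per segment reduces sensitivity to timing variations."""
    feats = []
    for level in range(levels):
        for chunk in np.array_split(seq, 2 ** level):
            feats.append(chunk.mean(axis=0))
    return np.concatenate(feats)  # 7 segments x 3 dims = 21 values

# Toy data: two synthetic "gesture classes" whose trajectories
# drift in different directions (class 1 drifts along +x).
rng = np.random.default_rng(0)

def make_traj(label, frames=60):
    steps = rng.normal(size=(frames, 3)) + label * np.array([0.5, 0.0, 0.0])
    return np.cumsum(steps, axis=0)

X = np.stack([temporal_pyramid(direction_features(make_traj(lbl)))
              for lbl in (0, 1) for _ in range(20)])
y = np.repeat([0, 1], 20)

clf = SVC(kernel="rbf").fit(X, y)
```

In a real system the hand-shape features (computed over adjacent pairs of the 22 joint points) would be pooled per subsequence in the same way and concatenated with the direction features before classification.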
Keywords/Search Tags:gesture recognition, database, SVM, feature extraction, virtual gesture interaction