
Somatosensory Technology Of Human Computer Interaction With Space Robot

Posted on: 2016-08-04
Degree: Master
Type: Thesis
Country: China
Candidate: Y T Fan
Full Text: PDF
GTID: 2348330488974279
Subject: Control theory and control engineering
Abstract/Summary:
The computer, now indispensable in work and daily life, has advanced enormously over the past half century: machines have become smaller, computing power greater, and data sharing and exchange ever more frequent. The way we interact with computers, however, has not changed radically; it is still centred on the mouse and keyboard, which hinders the move toward more intelligent, convenient, and human-oriented computing. Somatosensory (motion-sensing) interaction is a new interaction technology in which users communicate with devices or virtual scenes through body movement. It has already been applied successfully in motion-sensing games, health care, three-dimensional virtual reality, spatial pointing devices, motion detection, and other fields, which makes somatosensory human-computer interaction a research topic worth exploring.

This thesis studies gesture recognition and body recognition algorithms for somatosensory interaction and combines them with manipulator control to build an interactive system in which a robotic arm in a virtual space environment is driven by body movement for interactive demonstrations. The work comprises the following parts:

1. Hand detection and segmentation. Depth images are acquired with a Kinect sensor, and the Kinect SDK is used for user tracking. The hand positions are located, a depth threshold is applied around each hand, and noise-reduction post-processing yields segmented images of both hands, effectively excluding the background and interference from other users.

2. Fingertip recognition. Three fingertip recognition algorithms are implemented and their performance compared: the convex-hull/convexity-defect method, curvature analysis, and a three-point detection method.
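The depth-threshold segmentation in step 1 can be sketched as follows. This is a minimal illustration, assuming the depth frame is already available as a NumPy array and the tracked hand-joint pixel is known; the subsequent noise-reduction stage mentioned in the thesis (e.g. median filtering) is omitted, and the band width is an arbitrary choice, not a value from the thesis:

```python
import numpy as np

def segment_hand(depth_mm, hand_xy, band=100):
    """Keep only pixels within +/-band millimetres of the tracked hand's depth.

    depth_mm: 2-D array of depth values in millimetres (e.g. one Kinect frame).
    hand_xy:  (row, col) of the hand joint reported by the skeleton tracker.
    Returns a boolean mask that is True for candidate hand pixels.
    """
    hand_depth = int(depth_mm[hand_xy])
    # Signed arithmetic avoids wrap-around on unsigned depth frames.
    diff = np.abs(depth_mm.astype(np.int32) - hand_depth)
    return diff <= band
```

Because the threshold is relative to the tracked hand's own depth, background surfaces and other users standing at different depths fall outside the band and are excluded automatically.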
The best-performing of these, the three-point detection method, is further improved to reduce missed and false fingertip detections and to enrich the fingertip descriptor, making fingertip extraction more broadly and universally applicable.

3. Static gesture recognition. HOG descriptors and SURF-based bag-of-words (BOW) descriptors are selected as static gesture features, a multi-class SVM is used for training, testing, and recognition, and the two approaches are compared. For the SURF-based BOW descriptor, several experiments are conducted to determine the optimal parameters in terms of runtime and recognition rate.

4. Dynamic gesture recognition. A simple and practical dynamic gesture segmentation method is proposed: the dwell time of the palm determines the start and end of a dynamic gesture. A normalization algorithm is also proposed that normalizes the trajectory sequences of different users and reduces inter-user variation. Finally, a DTW template-matching algorithm recognizes the dynamic gestures quickly and with high accuracy.

5. Limb recognition. The user's body movements are described by changes in joint angles, which reduces the information complexity while making the actions more distinguishable. A Kalman filter smooths the joint angles, eliminating much of the jitter during interaction and making it feel more natural.

6. Implementation of the interaction modes and design of the interactive scene.
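The trajectory normalization and DTW matching of step 4 can be sketched as below. This is a generic illustration of the classic DTW template-matching idea, not the thesis's exact formulation; the normalization here (centroid translation plus scaling by the larger bounding-box extent) is one common choice and is an assumption:

```python
import math

def normalize(traj):
    """Translate a 2-D trajectory to its centroid and scale it by its
    larger bounding-box extent, reducing differences between users."""
    xs = [p[0] for p in traj]
    ys = [p[1] for p in traj]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in traj]

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping between two
    2-D point sequences, with Euclidean point-to-point cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.hypot(a[i - 1][0] - b[j - 1][0],
                              a[i - 1][1] - b[j - 1][1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def classify(trajectory, templates):
    """Return the label of the template with the smallest DTW distance."""
    return min(templates, key=lambda label: dtw_distance(trajectory, templates[label]))
```

Because DTW warps the time axis, the same gesture drawn faster or slower still matches its template, which is what makes template matching practical for free-hand trajectories.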
A manipulator model is established and an interactive robotic-arm scene is implemented. Interaction modes based on static and dynamic gestures of the hands, fingers, and limbs are designed, so that the simulated virtual arm can be controlled entirely through somatosensory interaction. Finally, the relevant algorithms are packaged into interactive software based on MFC dialog boxes, which provides dynamic and static gesture recognition, fingertip detection, limb recognition, and manipulator interaction features.
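The gesture-to-arm mapping in step 6 might be organized along these lines. The gesture labels, joint layout, and step size below are illustrative assumptions, not the thesis's actual interface; the point is only the dispatch pattern of combining a static gesture (which joint) with a dynamic gesture (which direction):

```python
# Hypothetical sketch: mapping recognised gesture labels to joint commands
# for a simulated arm. All names and values here are illustrative.

class VirtualArm:
    """Minimal stand-in for the simulated manipulator: three joint angles in degrees."""
    def __init__(self):
        self.joints = [0.0, 0.0, 0.0]

    def rotate(self, joint, delta):
        # Clamp each joint to a +/-90 degree range.
        self.joints[joint] = max(-90.0, min(90.0, self.joints[joint] + delta))

# The static gesture selects the joint; the dynamic gesture jogs it.
COMMANDS = {
    ("fist", "swipe_left"):  (0, -5.0),
    ("fist", "swipe_right"): (0, +5.0),
    ("palm", "swipe_left"):  (1, -5.0),
    ("palm", "swipe_right"): (1, +5.0),
}

def dispatch(arm, static_gesture, dynamic_gesture):
    """Apply the command, if any, for the recognised gesture pair."""
    cmd = COMMANDS.get((static_gesture, dynamic_gesture))
    if cmd is not None:
        arm.rotate(*cmd)
```

Keeping the mapping in a table rather than in conditional logic makes it easy to add or rebind gestures without touching the arm-control code.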
Keywords: human-computer interaction, somatosensory interaction, gesture recognition, robot control