
Object Tracking And Gesture Recognition For Assembly Learning From Demonstration

Posted on: 2016-09-27
Degree: Master
Type: Thesis
Country: China
Candidate: H Y Pang
Full Text: PDF
GTID: 2308330461952701
Subject: Control Science and Engineering
Abstract/Summary:
With the development of robot technology, robots have penetrated into daily life and manufacturing, and more natural learning from demonstration has become a research hotspot in the academic community. Since vision is the main channel through which a robot perceives its environment, vision-based action recognition is one of the basic tasks in learning from demonstration. Because human behavior is multi-layered and context-sensitive, recognizing actions in combination with the surrounding scene is of significant research value.

Our research addresses learning from demonstration for an intelligent assembly robot. Aiming at accurate recognition of the demonstrator's manipulation motions, we study object tracking, static hand-gesture detection and recognition, and dynamic gesture recognition. The main contributions are as follows:

1. An object-tracking algorithm combining a global target descriptor with local template updating is proposed. The overall framework is Bayesian parameter estimation and consists of two parts. In the observation model, we extract Haar-like features of the target and reduce their dimensionality with a random projection matrix satisfying the restricted isometry property (RIP). In the dynamic model, among the particles generated by the particle filter, we choose the maximum-a-posteriori particle as the tracking result; the posterior of each particle is computed with a naive Bayes classifier whose parameters are learned from the templates in our template set. We use sparse representation to reconstruct each patch of the template, and from the reconstruction error we detect and reject patches occluded by objects in the environment, which avoids tracking drift. We evaluate the tracker on different sequences; the results show that the proposed method performs well under target occlusion, cluttered backgrounds of similar appearance, illumination changes, and so on.

2. A system for hand detection and static hand-gesture recognition is proposed.
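As a rough illustration of the kind of skin-color segmentation used for hand detection, the sketch below thresholds pixels in YCbCr space. The BT.601 conversion is standard; the Cb/Cr bounds are a common textbook choice, not the thresholds learned in this thesis:

```python
import numpy as np

# Illustrative Cb/Cr bounds for skin pixels; the thesis learns its own
# skin-pixel distribution, so these constants are assumptions.
CB_RANGE = (77, 127)
CR_RANGE = (133, 173)

def rgb_to_ycbcr(img):
    """Convert an (H, W, 3) RGB image to YCbCr (ITU-R BT.601, full range)."""
    img = img.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(img):
    """Boolean mask of pixels whose Cb/Cr values fall in the skin range."""
    ycbcr = rgb_to_ycbcr(img)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((CB_RANGE[0] <= cb) & (cb <= CB_RANGE[1]) &
            (CR_RANGE[0] <= cr) & (cr <= CR_RANGE[1]))
```

The resulting mask would then be cleaned up (e.g. with morphological operations) and its largest connected component taken as the hand region before feature extraction.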
Based on an analysis of the skin-pixel distribution, a hand detection and segmentation algorithm in YCbCr color space is proposed. The hand gesture is described by the pyramid histogram of oriented gradients and Zernike moments; in addition, kernel principal component analysis is used to extract nonlinear features from the gesture image. A dataset containing 11 kinds of hand gestures was collected to test the algorithm; combining the features above, a nearest-neighbor classifier achieves 98% classification precision.

3. An operational-motion recognition system based on the hidden Markov model is proposed. The observations of each model are the hand-gesture features extracted from every frame of the action video. We use the EM algorithm to estimate each model's parameters, where each model represents one kind of operational motion. Given these models, we choose the one with the maximum posterior probability as the label, and apply a rejection threshold to reduce the false-accept rate. With this algorithm we classify four common assembly actions: push, put in, take out, and press. Some operational motions may have different meanings even though they share the same observations; in this case, we extract the trajectory of the manipulated object and compute its motion vector, which allows us to distinguish the different tasks.
Keywords/Search Tags: Industrial Robot, Learning From Demonstration, Object Tracking, Hand Gesture Recognition