
Research On Visual Guidance And Control Method For UR Robot Based On Gesture Understanding

Posted on: 2018-05-04
Degree: Master
Type: Thesis
Country: China
Candidate: Y H Liu
Full Text: PDF
GTID: 2348330512456968
Subject: Mechanical engineering

Abstract/Summary:
As robot technology is applied ever more widely in industrial manufacturing, military operations and medical treatment, human-robot interaction is moving in an increasingly "human-oriented" direction. The traditional interaction mode based on the mouse, keyboard and operation panel can no longer meet these needs, so vision-guided control based on gesture understanding has become a new trend in human-computer interaction. This thesis studies a visual guidance and control method for a UR robot based on gesture understanding: gesture recognition, gesture tracking and remote control of the UR robot are investigated, and a human-computer interaction system based on gesture recognition and tracking is established.

For gesture recognition, a method combining skin color segmentation with the Viola-Jones algorithm is proposed to improve detection accuracy and reduce the error rate. A skin segmentation module first removes most of the non-skin regions from the background, and a gesture detector trained with the Viola-Jones algorithm then recognizes the gesture. In the segmentation module, a skin color model in the YCbCr color space separates the skin regions from the image, and morphological filtering removes the residual noise. In the Viola-Jones stage, Haar features, the integral image and a cascade architecture are used to train a detector for three target gestures. Comparative tests show that the method outperforms a conventional detector and meets the requirements of the human-computer interaction system.

For gesture tracking, an improved Shi-Tomasi feature point extraction algorithm is combined with a two-module tracker. The improved Shi-Tomasi step discards feature points that are sensitive to noise or that do not lie on the gesture, and the remaining reliable, stable points are fed to the tracker to localize the gesture target. When feature points are lost, a Kalman filter predicts the gesture position, and the predicted detection range keeps detection efficient and tracking continuous, so that tracking failures and discontinuous guidance signals caused by occlusion or overlap are avoided. (A minimal sketch of this detection-and-tracking pipeline is given after the abstract.)

For remote control of the UR robot, the motion control mechanism of the robot is analyzed in depth, and a remote control method is designed to solve the remote control problem and verified on the MATLAB platform. The robot end-effector successfully tracks a sine trajectory and the condition of the robot is monitored, which confirms the correctness and effectiveness of the method. (A sketch of a socket-based control loop of this kind also follows the abstract.)

Finally, the gesture recognition algorithm, the gesture tracking algorithm and the robot remote control method are integrated into a human-computer interaction system driven by the user's gestures. The robot's movement is guided by recognizing and tracking the gesture, enabling real-time, user-friendly interaction with the UR manipulator platform and realizing visual guidance control of the UR robot based on gesture understanding.
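The following is a minimal sketch, in Python with OpenCV, of the detection-and-tracking pipeline described above: YCbCr skin segmentation with morphological filtering, a Viola-Jones cascade detector restricted to the skin regions, and Shi-Tomasi corners tracked with the pyramidal Lucas-Kanade (KLT) tracker. The cascade file name, the YCbCr thresholds and the tracker parameters are illustrative assumptions, not values taken from the thesis.

    import cv2
    import numpy as np

    # Hypothetical cascade trained on the three target gestures (file name assumed).
    gesture_cascade = cv2.CascadeClassifier("gesture_cascade.xml")

    def skin_mask(frame_bgr):
        """Segment candidate skin regions in YCbCr space, then clean up morphologically."""
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        # Commonly used Cr/Cb bounds; the thesis's exact thresholds are not given here.
        mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    def detect_gesture(frame_bgr):
        """Run the Viola-Jones cascade only where the skin model indicates skin."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        gray = cv2.bitwise_and(gray, gray, mask=skin_mask(frame_bgr))
        return gesture_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    def track_gesture(prev_gray, gray, roi):
        """Shi-Tomasi corners inside the detected ROI, tracked by pyramidal LK (KLT)."""
        x, y, w, h = roi
        roi_mask = np.zeros_like(prev_gray)
        roi_mask[y:y + h, x:x + w] = 255
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50, qualityLevel=0.01,
                                      minDistance=5, mask=roi_mask)
        if pts is None:
            return None                      # nothing to track; fall back to detection
        nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good = nxt[status.ravel() == 1]
        if len(good) == 0:
            return None                      # a Kalman filter would predict the position here
        return good.reshape(-1, 2).mean(axis=0)  # centroid used as the gesture position

In a full loop, the detector would re-initialize the ROI whenever tracking returns None, and a Kalman filter on the centroid would supply the predicted search range described in the abstract.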
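The thesis verifies its remote control method on the MATLAB platform; the sketch below only illustrates the general idea of commanding a sinusoidal end-effector motion by streaming URScript to the controller over a TCP socket (the UR secondary client interface on port 30002). The robot IP address, the base pose and the sine-trajectory parameters are assumptions, not the thesis's implementation.

    import math
    import socket
    import time

    ROBOT_IP = "192.168.0.100"     # hypothetical controller address
    PORT = 30002                   # UR secondary client interface (accepts URScript)

    with socket.create_connection((ROBOT_IP, PORT)) as sock:
        t0 = time.time()
        while time.time() - t0 < 10.0:                     # 10 s of motion
            t = time.time() - t0
            dz = 0.05 * math.sin(2.0 * math.pi * 0.5 * t)  # +/-5 cm sine at 0.5 Hz
            # movel expects a pose p[x, y, z, rx, ry, rz] in metres / radians;
            # the base pose below is purely illustrative.
            pose = [0.4, -0.2, 0.3 + dz, 0.0, 3.14, 0.0]
            cmd = "movel(p%s, a=1.2, v=0.25, t=0.1)\n" % pose
            sock.send(cmd.encode("utf-8"))
            time.sleep(0.1)

Condition monitoring, as described in the abstract, would read the robot's state back from the controller alongside this command stream.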
Keywords/Search Tags: visual guidance control, UR robot, gesture recognition, gesture tracking, Shi-Tomasi algorithm, KLT algorithm