
Research On Visual-Tactile Fusion Grasping For Space Telemanipulation Robot

Posted on: 2023-02-10
Degree: Master
Type: Thesis
Country: China
Candidate: S X Shen
Full Text: PDF
GTID: 2568307061458794
Subject: Measurement technology and equipment
Abstract/Summary:
As space exploration goes ever farther, space robot teleoperation technology has been applied to large space equipment such as space shuttles, space stations, and satellites, replacing astronauts in exploration and maintenance tasks in dangerous environments. In space robot operations, grasping and manipulating target objects is the basis for completing complex tasks, and sensing information about the target object is the prerequisite for stable grasping and manipulation. Vision and touch are important modalities for a robot perception system. In a complex space environment, the two modalities complement each other to obtain the geometric characteristics, physical characteristics, and grasping state of the target, assisting teleoperated grasping. In this thesis, against the background of large-time-delay space teleoperation based on a virtual environment, visual-tactile fusion perception and grasping control are studied to improve the grasping performance of space teleoperation.

To support research on visual-tactile fusion grasping for space teleoperation robots, a ground verification system for space teleoperation based on a virtual environment is designed and built as the experimental platform. The platform consists of a force-feedback hand controller, a Schunk manipulator, a Barrett dexterous hand, and visual and tactile sensing modules. First, the kinematics of the hand controller, the manipulator, and the dexterous hand are derived so that the heterogeneous hand controller on the master side can teleoperate the manipulator on the slave side. Second, for virtual environment modeling, a model-import method is proposed for pre-defined environments and a point cloud registration method is proposed for unstructured environments. Finally, the perception system is designed and the ROS-based control system on the slave side is implemented.

For recognition of pre-defined targets during virtual environment modeling in a complex space environment, a module for autonomous acquisition and recognition of visual and tactile information is built on the slave side of the system. First, point cloud localization and path planning based on visual perception enable autonomous acquisition of visual and tactile information about objects in the workspace. Second, a combined CNN-GRU neural network is designed to match the characteristics of visual and tactile information, and the model is trained and verified on a self-built visual-tactile dataset of 14 kinds of similar objects in complex environments, achieving accurate target recognition.

To perceive the hardness of the target object and control dexterous-hand grasping, a three-class SVM model based on tactile perception and a grip-force hierarchical admittance control strategy are studied. A tactile hardness dataset of 15 common objects is built to train the model, and 7 objects outside the dataset are used to verify it, achieving hardness classification of target objects. On this basis, grasping experiments with the grip-force hierarchical admittance control strategy are carried out; they achieve accurate grip-force control for objects of different hardness and verify the effectiveness of the control strategy.
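To make the hierarchical admittance idea above concrete, the following minimal Python sketch pairs a three-class hardness SVM with hardness-dependent admittance gains and one discrete-time admittance update of the finger position command. The gain values, force references, tactile dimensions, and function names are illustrative assumptions, not the controller implemented in the thesis.

# Minimal sketch (not the thesis code): a three-class hardness SVM selecting
# admittance gains, followed by one discrete-time admittance update of the
# commanded finger position. All names, gains, and dimensions are assumptions.
from sklearn.svm import SVC

# Hardness classifier over flattened tactile frames (assumed fixed-length vectors).
hardness_svm = SVC(kernel="rbf", C=10.0, gamma="scale")
# hardness_svm.fit(X_tactile, y_hardness)   # y in {0: soft, 1: medium, 2: hard}

# Hierarchical gains: one virtual mass / damping / stiffness set per hardness level,
# together with a target grip force for that level (values are illustrative only).
GRIP_LEVELS = {
    0: {"m": 0.5, "b": 8.0,  "k": 40.0,  "f_ref": 1.0},   # soft: compliant, light grip
    1: {"m": 0.5, "b": 12.0, "k": 80.0,  "f_ref": 2.5},
    2: {"m": 0.5, "b": 16.0, "k": 150.0, "f_ref": 5.0},   # hard: stiffer, firmer grip
}

def admittance_step(x, dx, f_meas, level, dt=0.005):
    """One step of m*ddx + b*dx + k*x = f_ref - f_meas; returns new position command."""
    g = GRIP_LEVELS[level]
    ddx = (g["f_ref"] - f_meas - g["b"] * dx - g["k"] * x) / g["m"]
    dx = dx + ddx * dt
    return x + dx * dt, dx

# Typical loop (hypothetical usage):
# level = int(hardness_svm.predict([tactile_vector])[0])
# x_cmd, dx = admittance_step(x_cmd, dx, measured_grip_force, level)

The hierarchy here is simply that the SVM's hardness class switches the virtual mass, damping, stiffness, and reference grip force used by the admittance law; the thesis's actual gain scheduling and force levels may differ.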
For the target grasping task under large-time-delay space teleoperation, grasping-state judgment based on visual and tactile perception and an adaptive grasping-force threshold adjustment strategy are studied, and a partially semi-autonomous teleoperated grasping system is constructed on the basis of this work. First, a combined C3D-GRU network model based on visual-tactile fusion is constructed, and a dataset of 11 common objects is built to train and verify the model, realizing online recognition of the grasping state (an illustrative sketch of such a fusion network is given at the end of this abstract). On the basis of grasping-state perception and hierarchical admittance control, an adaptive adjustment strategy for the grip-force threshold is designed. Grasping experiments under different initial grasping forces show that the strategy selects appropriate grip-force thresholds and achieves stable grasp control. Finally, the partially semi-autonomous teleoperation control process is designed, and partially semi-autonomous grasping experiments on the 11 objects are conducted on the experimental platform, verifying the working performance of the system.

In summary, against the background of grasping operations by a space teleoperation robot, this thesis perceives information about the target object, such as its characteristics and grasping state, from visual and tactile information, and designs corresponding grasping control strategies to improve the partial semi-autonomy of space teleoperated grasping.
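As referenced above, the following is a hypothetical PyTorch sketch of a C3D-GRU style visual-tactile fusion network for grasp-state classification: a small 3D convolutional stack over short image clips, a GRU over tactile sequences, and a joint classification head. Layer sizes, input shapes, and state labels are assumptions for illustration, not the network architecture reported in the thesis.

# Hypothetical sketch of a C3D-GRU style fusion model for grasp-state classification.
# Layer sizes, input shapes, and labels are assumptions, not the thesis design.
import torch
import torch.nn as nn

class C3DGRUGraspNet(nn.Module):
    def __init__(self, n_states=3, tactile_dim=24, hidden=128):
        super().__init__()
        # 3D convolution over short RGB clips shaped (B, 3, T, H, W).
        self.c3d = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((None, 4, 4)),     # keep the time axis, pool space to 4x4
        )
        # GRU over per-frame tactile vectors shaped (B, T, tactile_dim).
        self.gru = nn.GRU(tactile_dim, hidden, batch_first=True)
        self.head = nn.Linear(32 * 4 * 4 + hidden, n_states)

    def forward(self, clip, tactile_seq):
        v = self.c3d(clip)                  # (B, 32, T, 4, 4)
        v = v.mean(dim=2).flatten(1)        # average over time -> (B, 512)
        _, h = self.gru(tactile_seq)        # final hidden state: (1, B, hidden)
        fused = torch.cat([v, h.squeeze(0)], dim=1)
        return self.head(fused)             # logits over grasp states (e.g. stable / slipping / lost)

# Example shapes (hypothetical): 8-frame 64x64 clips and 8 tactile frames of 24 taxel readings.
# model = C3DGRUGraspNet()
# logits = model(torch.randn(2, 3, 8, 64, 64), torch.randn(2, 8, 24))

Concatenating the time-pooled clip features with the GRU's final hidden state is only one plausible fusion scheme; per-frame or attention-based fusion would fit the same interface.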
Keywords/Search Tags: space teleoperation, dexterous hand grasping, visual and tactile perception, neural network, partial semi-autonomous