
Research On Intelligent Assembly Technology Based On Human-Computer Interaction System

Posted on: 2020-11-21
Degree: Master
Type: Thesis
Country: China
Candidate: H Qi
Full Text: PDF
GTID: 2518306350476654
Subject: Control Engineering
Abstract/Summary:
Robot programming has long been a problem pursued by academic and industrial communities at home and abroad. Classic methods include teach-in programming, offline programming, guided programming, and remote programming, each of which has seen wide application in production and research because of its particular advantages. New applications increasingly demand intelligence from robots, especially for small-batch, short-cycle products, and the range of application fields keeps broadening. This poses new challenges for robot programming, correction, and cognitive capability. Programming by demonstration has therefore become a major research hotspot, but many studies translate human demonstrations directly into robot programs. Because the assembly process is complex and human demonstrations are uncertain, directly programming the robot in this way is often difficult, especially for small workpieces, whose assembly tolerances are tight and whose assembly process is monotonous. This is one reason why direct demonstration programming has not been widely adopted in industry. This thesis instead first extracts the demonstrator's assembly skills, so that the robot can imitate them, program itself automatically, and automatically optimize the assembly process, which is an innovative line of research.

This thesis builds mainly on traditional demonstration programming. To address the complexity of existing robot assembly learning and its demanding programming skill requirements, an implicit interaction method based on the fusion of forearm surface EMG (sEMG) signals and inertial multi-source information is proposed to realize robot programming by demonstration. First, a MYO bracelet and a data glove collect multi-source information, such as sEMG signals, inertial data, and finger bending angles, during the demonstrator's single-step assembly process. Second, the assembly action is recognized by an assembly gesture recognition algorithm so that the robot learns the assembly process. Then, drawing on the demonstrator's assembly experience, a deep deterministic policy gradient (DDPG) algorithm modifies the assembly parameters to improve adaptability to changes in the assembly objects and the environment; built on demonstration programming, reinforcement learning ensures that the robot performs its task stably. Finally, the collaborative robot reproduces the assembly process. Moreover, under external disturbances such as changes in the camera position and in the relative position of the shaft and hole, deep reinforcement learning optimizes the assembly process and improves the robustness of the reproduction.

For the collaborative robot's demonstration programming, and in view of the characteristics of the multi-source data collected during the demonstrator's single-step assembly, features of the inertial and sEMG signals are extracted automatically by one-dimensional convolution and pooling, which improves the rationality and accuracy of feature recognition. Two classifiers are verified separately: an assembly gesture recognition network improved from AlexNet, and a transfer learning method based on a sign language recognition network structure (improved from VGGNet). Both outperform a traditional machine learning baseline (SVM), with recognition rates more than 3% higher. In addition, for the online recognition process, a Siamese (twin) neural network is introduced as a verification model, which improves online gesture recognition; its effect was verified in an online gesture control experiment on an omnidirectional mobile platform robot.

In the demonstration reproduction and optimization experiments on the cooperative robot, two schemes for constructing the reinforcement learning environment are designed. The first fuses the frame difference method with a multi-feature kernelized correlation filter (MKCF) to track, with a PrimeSense 3D camera, the environmental changes along the X and Y axes, which serve as the state. The second trains a variational autoencoder (VAE) to encode and decode axle-hole images and takes the encodings as the state. Both schemes adopt the DDPG network structure for deep reinforcement learning of the continuous process. Under external disturbances such as changes in the camera position and in the relative position of the axis and hole, the manipulator automatically adjusts its end position according to the generalized policy model obtained by reinforcement learning, realizing demonstration learning of axis-hole assembly.

In summary, this thesis presents an intelligent assembly technology based on programming by demonstration. sEMG and inertial information are extracted from the demonstrator's natural assembly process through MYO and data-glove wearable devices; the demonstrator's assembly gestures are recognized by a deep learning algorithm so that the robot learns the assembly process; and disturbances of the environment during reproduction are compensated by deep reinforcement learning, enhancing the robustness of the system.
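As a rough illustration of the one-dimensional convolution-and-pooling feature extraction described above, the sketch below applies a bank of filters to one fused multi-channel window. The channel counts (8 sEMG, 9 inertial, 5 glove-angle channels), the 200-sample window, the kernel width, and the random filter weights are illustrative assumptions, not the thesis's actual trained network.

```python
import numpy as np

def conv1d(x, kernels):
    """Valid 1-D convolution: x is (channels, T), kernels is (F, channels, k).
    Returns (F, T - k + 1) feature maps summed over all input channels."""
    F, C, k = kernels.shape
    T = x.shape[1]
    out = np.zeros((F, T - k + 1))
    for f in range(F):
        for t in range(T - k + 1):
            out[f, t] = np.sum(kernels[f] * x[:, t:t + k])
    return out

def max_pool1d(x, p):
    """Non-overlapping max pooling along the time axis; x is (F, T)."""
    F, T = x.shape
    T2 = T // p  # drop any trailing samples that do not fill a window
    return x[:, :T2 * p].reshape(F, T2, p).max(axis=2)

rng = np.random.default_rng(0)
window = rng.standard_normal((22, 200))     # 8 sEMG + 9 inertial + 5 glove channels (assumed)
kernels = rng.standard_normal((16, 22, 5))  # 16 filters of width 5 (assumed, untrained)
features = max_pool1d(np.maximum(conv1d(window, kernels), 0.0), p=4)  # ReLU then pool
print(features.shape)  # → (16, 49)
```

In the thesis's pipeline these kernels would be learned end-to-end together with the gesture classifier; random weights are used here only to show the tensor shapes flowing through the convolution and pooling layers.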
Keywords/Search Tags:Intelligent assembly, Demonstration programming, Gesture recognition, sEMG, Reinforcement learning, Deep learning