
Intention Recognition and Motion Prediction of the Human Upper Limb in Assembly-Oriented Human-Robot Collaboration and Its Application

Posted on: 2023-03-15
Degree: Master
Type: Thesis
Country: China
Candidate: M C Dong
Full Text: PDF
GTID: 2558306623468904
Subject: Control engineering
Abstract/Summary:
With the transformation of industrial manufacturing from large-scale mass production to customized, intelligent production, human-robot collaboration (HRC) is the trend and direction of the future industrial production mode. To ensure the safety and efficiency of HRC, collaborative robots need the ability to perceive and understand human behavior in the workspace. In this thesis, human behavior understanding in HRC is carried out through motion intention recognition and trajectory prediction of the human upper limb. Intention recognition enables the robot to actively collaborate with the human and improves work efficiency; trajectory prediction ensures that the robot maintains a safe distance from the human body during collaboration and avoids safety accidents. The main research contents are as follows:

(1) Aiming at the flexible and changeable collaboration scenarios and methods in HRC, an experimental platform is constructed with assembly-type production operations as the research and application background. Five kinds of human upper limb motion intentions are designed for the collaboration process. A Vicon motion capture system is used to collect human upper limb motion data and form a dataset in 3-D space, which provides the raw data for the subsequent research on human behavior understanding.

(2) For human upper limb motion intention recognition in HRC, a projection-reconstruction mapping method is proposed in Chapter 3. The time-series coordinate information of human upper limb motion in 3-D space is converted into feature images in the 2-D plane, which preserve the spatial and temporal continuity of the motion. A dataset of human upper limb motion feature images is established, so that recognizing upper limb motion intent becomes a matter of recognizing motion feature images. An intention recognition model for the human upper limb is obtained by transfer learning on AlexNet. By recognizing the feature images, accurate motion intentions are identified in the early stage of upper limb motion: the experimental results show that the recognition accuracy of the method reaches 74% and 100% when the action completion degree is 20% and 30%, respectively.

(3) For human upper limb motion trajectory prediction in HRC, a deep learning method is combined with the echo state network learning mechanism: a Deep Belief Echo State Network (DBESN) is constructed from several restricted Boltzmann machines and an echo state network regression layer. The DBESN offers large memory capacity and strong nonlinear approximation ability. A prediction model is trained on down-sampled human upper limb trajectory datasets, and multi-step trajectory prediction is then realized with a cyclic multi-step prediction method. Trajectory prediction models are compared and analyzed in terms of prediction error and computation time. The experimental results show that the proposed method keeps the trajectory prediction error within an acceptable 2-4 cm, with an average computation time of only 40.5 ms, which meets the HRC system's requirements for prediction accuracy and real-time performance.

Finally, based on the Robot Operating System (ROS), an assembly-oriented real-time HRC system consisting of an intent recognition module, a trajectory prediction module, and a motion control module is built to further verify the effectiveness of the proposed methods.
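The cyclic multi-step prediction scheme in item (3) can be illustrated with a minimal sketch: a one-step predictor is applied repeatedly, feeding each predicted point back into the input window. The simple ridge-regression predictor below is only a hypothetical stand-in for the thesis's DBESN, and the smooth synthetic 3-D trajectory is invented for illustration; the function names are assumptions, not the author's code.

```python
import numpy as np

def fit_one_step(traj, order=3, ridge=1e-6):
    """Fit a linear map from the last `order` 3-D points to the next point."""
    X = np.stack([traj[i:i + order].ravel() for i in range(len(traj) - order)])
    Y = traj[order:]
    # Ridge-regularized least squares: W maps a flattened window to one point.
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)
    return W

def predict_multi_step(traj, W, order, n_steps):
    """Cyclic prediction: roll the one-step model forward, feeding outputs back."""
    window = list(traj[-order:])
    preds = []
    for _ in range(n_steps):
        x = np.concatenate(window)          # flatten the current window
        y = x @ W                           # one-step prediction (3-D point)
        preds.append(y)
        window = window[1:] + [y]           # slide window: drop oldest, append prediction
    return np.array(preds)

# Synthetic smooth "wrist" trajectory in 3-D, standing in for captured motion data.
t = np.linspace(0.0, 2.0, 200)
traj = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)

W = fit_one_step(traj[:150])
future = predict_multi_step(traj[:150], W, order=3, n_steps=10)
err = np.linalg.norm(future - traj[150:160], axis=1)
print(future.shape, err.max())
```

Because predictions are fed back cyclically, errors compound with the prediction horizon; this is why the thesis evaluates both prediction error and computation time when choosing a model for real-time HRC.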
Keywords/Search Tags:Human-robot collaboration, Human behavior understanding, Intent recognition, Trajectory prediction, Transfer learning