
Research On Human-robot Interaction Intention Inference Based On Prediction Of Upper Limb Motion

Posted on: 2019-05-31
Degree: Master
Type: Thesis
Country: China
Candidate: G Liu
GTID: 2428330569478575
Subject: Mechanical engineering
Abstract/Summary:
As an important function of human-robot interaction, intention inference is increasingly applied in the field of service robots. Predicting and identifying the operator's intentions through natural human-robot interaction is of great significance for improving interaction efficiency. Traditional human-robot interaction methods are often characterized by high learning cost, complex interaction procedures, low interaction efficiency and a single channel of interactive information. Natural, harmonious and humanized human-robot interaction is therefore becoming a new direction for the development of human-robot interaction technology. Because single-channel interactive information such as gestures, voice, facial expression or gaze is limited and ambiguous, methods based on only one of these channels have difficulty identifying the interaction intention. Improving interaction efficiency by fusing multidimensional interaction information is thus of great significance to the development of human-robot interaction technology.

In this paper, a human-robot interaction intention inference method based on intention expression factors of human upper limb motion is proposed. It builds on the user's interaction scenario and on motion information captured by a somatosensory device that combines a color camera and a depth camera. The research is conducted from the following aspects.

Firstly, the imaging principles of the color camera and the depth camera are analyzed, color and depth images (RGB-D images) containing human-robot interaction information are acquired, and the coordinate mapping and fusion of the depth and color images are carried out. The coordinate mapping used in the RGB-D fusion process is also applied to extract the location of a marked target.

Secondly, the kinematic characteristics of the human upper limb and its motion characteristics during contact movements are analyzed, and an upper limb trajectory prediction method based on a back propagation (BP) artificial neural network is proposed. The upper limb contact trajectory is divided into a learning segment and a prediction segment; once the network has been trained, the later segment of the arm trajectory is predicted from the partially observed trajectory.
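The abstract gives no implementation details for the BP-network predictor, so the sketch below is only a minimal illustration of the underlying idea: mapping the observed first part of an arm trajectory to its later part with a small back-propagation network. The network size, the synthetic 2-D trajectories and all variable names are assumptions made for this example, not the configuration used in the thesis.

```python
# Illustrative sketch only: a tiny BP (feedforward) network trained by gradient
# descent to map the observed first half of a 2-D arm trajectory to its later
# half. Data, network size and training scheme are assumptions for this example.
import numpy as np

rng = np.random.default_rng(0)

def make_trajectory(n_points=20):
    """Synthetic 2-D 'contact' trajectory with a little noise (stand-in data)."""
    t = np.linspace(0.0, 1.0, n_points)
    x = t + 0.02 * rng.normal(size=n_points)
    y = np.sin(np.pi * t / 2) + 0.02 * rng.normal(size=n_points)
    return np.stack([x, y], axis=1).ravel()      # flatten to one feature row

# Build a small dataset: first half of each trajectory -> second half.
data = np.array([make_trajectory() for _ in range(200)])
half = data.shape[1] // 2
X, Y = data[:, :half], data[:, half:]

# One hidden layer with tanh activation, trained by plain back propagation.
n_in, n_hidden, n_out = X.shape[1], 32, Y.shape[1]
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out)); b2 = np.zeros(n_out)
lr = 0.05

for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    P = H @ W2 + b2                   # linear output: predicted later segment
    err = P - Y                       # prediction error
    # Back propagation of the mean-squared-error gradient.
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Predict the unseen later segment of a new, partially observed trajectory.
new_traj = make_trajectory()
observed = new_traj[:half]
predicted_rest = np.tanh(observed @ W1 + b1) @ W2 + b2
print("mean abs error:", np.abs(predicted_rest - new_traj[half:]).mean())
```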
Thirdly, intention expression factors of human-robot interaction, derived from the movement of the human upper limb and its predicted trajectory, are proposed. According to the strength of the relation between each factor and the expressed intention, every intention expression factor is mapped to a probability distribution. An intention inference algorithm based on Dempster-Shafer (D-S) evidence theory is then proposed: real-time evidence from each intention expression factor is combined, and intention inference based on the fused probabilities of D-S evidence theory is realized (a minimal illustrative sketch of this combination rule is given after the abstract).

Finally, experiments are conducted to verify the proposed D-S evidence theory intention inference algorithm. Experimental data for different users and different evidence fusion cycles are collected and analyzed. The results indicate that the proposed method can reasonably predict the motion trajectory of the user's arm and accurately infer the user's interaction intention, verifying the effectiveness and adaptability of the algorithm. Experiments varying the evidence fusion period show that multi-period evidence fusion improves the stability and accuracy of the intention inference algorithm. A cooperative grasping experiment applying the algorithm on the Baxter collaborative robot platform is also carried out, which further demonstrates the validity of the intention inference algorithm based on D-S evidence theory.
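As a concrete illustration of the evidence-fusion step mentioned above, the sketch below implements Dempster's rule of combination for mass functions defined over a small set of candidate intentions. The intention labels, the two evidence sources and the mass values are hypothetical examples, not data or factor definitions from the thesis.

```python
# Illustrative sketch only: Dempster's rule of combination for fusing basic
# probability assignments (mass functions) from several intention "evidence
# sources". The intention labels and mass values below are hypothetical.
from functools import reduce

FRAME = frozenset({"grasp", "handover", "no_interaction"})   # frame of discernment

def combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to conflicting evidence
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence sources are incompatible")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}   # normalize

# Hypothetical evidence from two intention expression factors
# (e.g. predicted trajectory end point and hand-to-target distance).
m_trajectory = {
    frozenset({"grasp"}): 0.6,
    frozenset({"grasp", "handover"}): 0.3,
    FRAME: 0.1,                              # residual ignorance
}
m_distance = {
    frozenset({"grasp"}): 0.5,
    frozenset({"no_interaction"}): 0.2,
    FRAME: 0.3,
}

fused = reduce(combine, [m_trajectory, m_distance])
for focal, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
    print(set(focal), round(mass, 3))
```

In such a scheme, the fused mass concentrating on a singleton hypothesis (for example {"grasp"}) would be read as the inferred intention once it exceeds a chosen decision threshold.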
Keywords/Search Tags: Intention inference, D-S evidence theory, Human-robot interaction, Service robot