
Research On Robot Control System Based On Kinect Depth Information Gesture Recognition

Posted on: 2021-09-22
Degree: Master
Type: Thesis
Country: China
Candidate: J Fang
Full Text: PDF
GTID: 2518306197455534
Subject: Signal and Information Processing
Abstract/Summary:
The human-computer interaction (HCI) technology industry is developing and finding new applications at a rapid pace. Gesture recognition is an essential interaction mode; it is natural and convenient, which makes it well suited to practical use in a wide range of scenarios. Gesture recognition converts a user's gestures into corresponding commands for controlling a computer or other devices, and the research draws on multidisciplinary knowledge from image processing, speech recognition, sensing, and related fields. With rich, vivid facial expressions and a variety of engaging voice functions, facial-expression robots have a significant advantage in education: they can interact better with different groups of students and make human-computer interaction more direct, more convenient, and more humanized. This paper designs a new type of mobile educational robot, a vehicle-mounted facial-expression robot. Applying gesture recognition to a vehicle-mounted facial-expression robot is a new line of research with broad application prospects.

In this paper, gesture recognition based on Kinect depth information is used to design a robot system built on a tracked vehicle body. The Kinect sensor captures hand motion information and communicates with the computer via USB, and the computer-processed hand data is sent to an Arduino master control board to control the robot. A voice control system and a mobile APP control system are also integrated into the facial-expression robot to provide additional control modes and meet the needs of different user groups. The main tasks are as follows:

(1) Gesture recognition model design. The gestures needed for human-computer interaction were designed; the Kinect sensor, together with OpenNI and NITE, was used to recognize and track the hand and obtain the palm point. For the different gestures, a judgment method based on a speed threshold combined with angle-variation detection was adopted to complete the set of control gestures (a minimal sketch of this kind of thresholding logic follows the abstract).

(2) System programming. The dynamic gesture recognition program, the mobile APP control program, and the voice interaction and expression control programs were designed. The mobile APP control design comprises two modules: the software for the Android platform and the receiving program on the Arduino side.

(3) Kinematic analysis of the tracked vehicle body. The chassis of the tracked vehicle-mounted facial-expression robot uses differential drive. A two-dimensional kinematic model of the robot on the horizontal plane is established, and the motion trajectories of the vehicle body under three kinds of motion are analyzed (see the kinematics sketch after this abstract).

(4) Vehicle-mounted facial robot verification system. The intelligent robot entity system was designed and developed. The control core of the system is an Arduino chip, a WiFi module serves as the data transmission link to the Android mobile phone APP, and the structure is built as a tracked vehicle-mounted facial-expression robot (an illustrative command-receiving sketch follows the abstract). The design of the facial expression module focuses on the expression control mode and the harmony of the expressions.

Experimental results show that the success rate of the gesture recognition tests is higher than 95%, that the mobile APP platform and the voice control module operate within reasonable limits, and that the system indicators and accuracy meet the test requirements. The system achieves accurate control of the robot and satisfies the needs of human-computer interaction applications.
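The abstract only names the detection method used in task (1), so the following is a minimal, self-contained C++ sketch of a speed-threshold plus angle-variation gesture judgment of that general kind. The PalmSample struct, the threshold values, and the gesture labels are illustrative assumptions rather than the thesis's implementation; in the actual system the palm point would come from the OpenNI/NITE hand tracker.

```cpp
#include <cmath>
#include <cstdint>
#include <string>

// Illustrative palm-point sample; in the thesis this data would come from
// the OpenNI/NITE hand tracker (position assumed in millimetres, time in ms).
struct PalmSample {
    float x, y, z;     // palm position in Kinect camera space
    uint64_t timeMs;   // capture timestamp
};

// Hypothetical thresholds; a real system would tune these experimentally.
constexpr float kSpeedThreshold  = 0.6f;   // m/s: slower motion is treated as "hold"
constexpr float kMinDisplacement = 0.05f;  // m: ignore small jitter
constexpr float kPi              = 3.14159265f;

// Classify the motion between two consecutive palm samples using a speed
// threshold plus the angle of the displacement vector in the image plane.
std::string classifyGesture(const PalmSample& prev, const PalmSample& curr) {
    const float dx = (curr.x - prev.x) / 1000.0f;            // mm -> m
    const float dy = (curr.y - prev.y) / 1000.0f;
    const float dt = (curr.timeMs - prev.timeMs) / 1000.0f;  // ms -> s
    if (dt <= 0.0f) return "none";

    const float dist  = std::sqrt(dx * dx + dy * dy);
    const float speed = dist / dt;
    if (speed < kSpeedThreshold || dist < kMinDisplacement) return "hold";

    // Angle of the displacement vector, measured from the +x axis.
    const float angle = std::atan2(dy, dx) * 180.0f / kPi;
    if (angle > -45.0f  && angle <= 45.0f)  return "swipe_right";
    if (angle > 45.0f   && angle <= 135.0f) return "swipe_up";
    if (angle > -135.0f && angle <= -45.0f) return "swipe_down";
    return "swipe_left";
}
```

A swipe is reported only when the palm moves fast enough and far enough between frames; the angle of the displacement vector then selects the direction.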
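Task (3) refers to the standard differential-drive model for a tracked chassis. As a rough illustration (the function and variable names and the Euler integration step are assumptions, not taken from the thesis), the planar kinematics can be sketched as follows; the idealization ignores track slip, which a real tracked vehicle exhibits.

```cpp
#include <cmath>

// Planar pose of the tracked vehicle in the world frame.
struct Pose2D {
    double x = 0.0;      // m
    double y = 0.0;      // m
    double theta = 0.0;  // rad, heading
};

// One Euler-integration step of the differential-drive kinematic model:
//   v     = (vRight + vLeft) / 2            forward speed
//   omega = (vRight - vLeft) / trackWidth   yaw rate
// Equal track speeds give a straight line, opposite speeds give a spin in
// place, and unequal speeds give an arc -- three kinds of vehicle-body motion.
Pose2D stepDifferentialDrive(Pose2D pose, double vLeft, double vRight,
                             double trackWidth, double dt) {
    const double v     = 0.5 * (vRight + vLeft);
    const double omega = (vRight - vLeft) / trackWidth;
    pose.x     += v * std::cos(pose.theta) * dt;
    pose.y     += v * std::sin(pose.theta) * dt;
    pose.theta += omega * dt;
    return pose;
}
```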
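Task (4) states that the computer sends the processed hand data to an Arduino master control board and that a WiFi module links the Android phone APP. The single-character command protocol, pin assignments, and speed values in the Arduino-style sketch below are purely assumed for illustration and are not the thesis's actual protocol or wiring.

```cpp
// Minimal Arduino sketch: receive one-byte motion commands over the serial
// link (e.g. forwarded by the PC after gesture recognition, or by the phone
// APP through a serial WiFi module) and drive the two tracks accordingly.

const int LEFT_PWM_PIN  = 5;   // assumed PWM pin, left track driver
const int RIGHT_PWM_PIN = 6;   // assumed PWM pin, right track driver
const int LEFT_DIR_PIN  = 7;   // assumed direction pin, left driver
const int RIGHT_DIR_PIN = 8;   // assumed direction pin, right driver

void drive(int leftSpeed, int rightSpeed) {
  // Sign selects direction, magnitude (0-255) sets the PWM duty cycle.
  digitalWrite(LEFT_DIR_PIN,  leftSpeed  >= 0 ? HIGH : LOW);
  digitalWrite(RIGHT_DIR_PIN, rightSpeed >= 0 ? HIGH : LOW);
  analogWrite(LEFT_PWM_PIN,  abs(leftSpeed));
  analogWrite(RIGHT_PWM_PIN, abs(rightSpeed));
}

void setup() {
  pinMode(LEFT_DIR_PIN,  OUTPUT);
  pinMode(RIGHT_DIR_PIN, OUTPUT);
  Serial.begin(9600);  // link to the PC over USB, or to a serial WiFi module
}

void loop() {
  if (Serial.available() > 0) {
    switch (Serial.read()) {
      case 'F': drive(180, 180);   break;  // forward
      case 'B': drive(-180, -180); break;  // backward
      case 'L': drive(-120, 120);  break;  // spin left
      case 'R': drive(120, -120);  break;  // spin right
      case 'S': drive(0, 0);       break;  // stop
      default:  break;                     // ignore unknown bytes
    }
  }
}
```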
Keywords/Search Tags: Gesture recognition, Voice control, Mobile phone app, Tracked vehicle, Facial expressions