
Design Of Mechanical Arm System Based On Facial Expression Driving

Posted on: 2022-01-02
Degree: Master
Type: Thesis
Country: China
Candidate: S Y Zhan
Full Text: PDF
GTID: 2518306317990159
Subject: Control Engineering
Abstract/Summary:
With the rapid development of robot technology, assistive robots play an increasingly important role in the lives of disabled people. To address the difficulty of self-care faced by disabled people who have lost the use of their hands or cannot control them voluntarily, a human-computer interaction method is constructed in which a virtual mechanical arm is driven by facial expressions through deep learning algorithms. According to changes in facial expression, the mechanical arm is controlled to complete the corresponding actions, helping these people manage self-care in daily life. This kind of human-computer interaction method has good theoretical research significance and practical application value.

Firstly, based on a review of the literature, the overall scheme of the expression-driven mechanical arm system is designed. Facial expressions are captured by a camera and accurately recognized and classified with a deep learning algorithm. An expression command system is established, and the corresponding command actions are performed by a virtual mechanical arm.

Secondly, facial expression data are collected, the data set is expanded with data augmentation techniques such as random cropping and mirroring, and a facial expression data set is established after labeling. To address the problem that the Faster R-CNN (Faster Region Convolutional Neural Network) backbone network has few layers and extracts insufficient deep image features, the Faster R-CNN network is improved: the deep residual network ResNet-50 replaces the original backbone for extracting facial expression features, preventing the loss of important expression features. A feature pyramid is constructed over feature maps at different levels to broaden feature extraction and effectively improve the detection of facial details. The anchor sizes in the detection network are changed and the training model is optimized to improve the accuracy of facial expression recognition.

Finally, SolidWorks is used to create a virtual mechanical arm model, 3ds Max is used to produce the model animation, and the result is imported into Unity3D, where the model resources are integrated and optimized and a Unity3D virtual experiment platform is built. The recognition results of the improved Faster R-CNN are used as control commands for the virtual mechanical arm and are transmitted to the virtual simulation platform through a socket interface. At the same time, a GUI visualization interface is designed to display the facial expression recognition results and the Unity3D virtual mechanical arm, realizing control of the virtual arm.

The experimental results show that, by giving instructions to the system through facial expressions, the virtual mechanical arm performs corresponding actions such as extending, grasping, and contracting, which verifies the feasibility of this human-computer interaction method.
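The abstract mentions expanding the expression data set with random cropping and mirroring. Below is a minimal sketch of such image-level augmentation using torchvision; the directory layout, crop parameters, and number of augmented copies are illustrative assumptions, and for detection-style training the bounding-box annotations would need to be adjusted alongside the cropped images.

```python
# Minimal augmentation sketch (assumed torchvision-based pipeline):
# expands an expression image set with random cropping and horizontal mirroring.
from pathlib import Path
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomResizedCrop(size=224, scale=(0.8, 1.0)),  # random cropping
    transforms.RandomHorizontalFlip(p=0.5),                    # mirror image
])

src_dir, dst_dir = Path("expressions/raw"), Path("expressions/augmented")  # assumed layout
dst_dir.mkdir(parents=True, exist_ok=True)

for img_path in src_dir.glob("*.jpg"):
    image = Image.open(img_path).convert("RGB")
    for i in range(3):  # three augmented copies per source image (illustrative)
        augment(image).save(dst_dir / f"{img_path.stem}_aug{i}.jpg")
```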
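For the improved detection network, the combination of a ResNet-50 backbone, a feature pyramid, and modified anchor sizes described above can be assembled with torchvision's detection API roughly as follows. This is a sketch assuming a recent torchvision release; the number of expression classes and the specific anchor sizes are illustrative assumptions, not values taken from the thesis.

```python
# Sketch of a Faster R-CNN detector with a ResNet-50 + FPN backbone and
# custom anchor sizes, assuming a torchvision-style implementation.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.rpn import AnchorGenerator

NUM_CLASSES = 8  # assumed: 7 expression categories + background

# One anchor size per FPN level (5 levels); the values are illustrative,
# chosen smaller than the defaults to suit face-sized regions.
anchor_generator = AnchorGenerator(
    sizes=((16,), (32,), (64,), (128,), (256,)),
    aspect_ratios=((0.5, 1.0, 2.0),) * 5,
)

model = fasterrcnn_resnet50_fpn(
    weights=None,            # no pretrained detection weights
    weights_backbone=None,   # avoid downloading ImageNet weights for this sketch
    num_classes=NUM_CLASSES,
    rpn_anchor_generator=anchor_generator,
)

# A forward pass in training mode expects images plus box/label targets.
images = [torch.rand(3, 480, 640)]
targets = [{"boxes": torch.tensor([[100.0, 80.0, 300.0, 320.0]]),
            "labels": torch.tensor([1])}]
loss_dict = model(images, targets)  # dict of classification/regression losses
print({k: float(v) for k, v in loss_dict.items()})
```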
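The recognition result is said to be transmitted to the Unity3D simulation platform over a socket interface. A minimal sketch of the sending side is given below, assuming a plain TCP connection in which each recognized expression is mapped to a short text command; the host, port, and expression-to-command mapping are placeholder assumptions, and the Unity3D side would need a matching listener.

```python
# Sketch of the sending side of the socket interface: maps a recognized
# expression label to a text command and sends it to the Unity3D platform.
# Host, port, and the expression-to-command mapping are assumed placeholders.
import socket

UNITY_HOST = "127.0.0.1"   # assumed address of the Unity3D simulation
UNITY_PORT = 8000          # assumed listening port

EXPRESSION_TO_COMMAND = {  # illustrative mapping, not the thesis's actual table
    "smile": "EXTEND",
    "frown": "CONTRACT",
    "mouth_open": "GRASP",
}

def send_command(expression: str) -> None:
    """Translate an expression label into a command and send it over TCP."""
    command = EXPRESSION_TO_COMMAND.get(expression)
    if command is None:
        return  # unrecognized expression: no command is issued
    with socket.create_connection((UNITY_HOST, UNITY_PORT), timeout=2.0) as conn:
        conn.sendall((command + "\n").encode("utf-8"))

if __name__ == "__main__":
    send_command("smile")  # would make the virtual arm extend in this sketch
```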
Keywords/Search Tags:Human-computer interaction, Facial expression recognition, Faster R-CNN, Virtual simulation