
Research and Implementation of Eye-Tracking Human-Machine Control Interface Technology

Posted on: 2021-04-10
Degree: Master
Type: Thesis
Country: China
Candidate: C Lei
Full Text: PDF
GTID: 2370330602495156
Subject: Engineering
Abstract/Summary:
With the development of society and continuing innovation in science and technology, the intersection of computer science and biomedicine has become a major direction of scientific research. Within this area, eye-tracking human-machine control interface technology has become a topic of active discussion and study. Gaze tracking is the key technology behind such interfaces and is widely applied in eye-controlled input methods, virtual reality, web page analysis, driver fatigue detection, military technology, medical assistance, and medical disease research.

At present, gaze tracking in eye-tracking human-machine control interfaces faces two main problems. First, it typically requires expensive wearable devices and additional infrared light sources. Second, head movement has a strong influence on gaze prediction. To address these problems, this thesis proposes to use an ordinary camera under natural light, collect facial and eye features, and apply neural network techniques to build a gaze mapping model between the human eye and the device screen. The main work of this thesis is as follows:

(1) Extraction of face and eye features. Haar features combined with the AdaBoost algorithm are used to detect and locate the face, and the ASM algorithm is then applied to extract facial feature points, yielding the inner corners of both eyes as well as nose and mouth features. On the basis of the detected face region, the eye region of interest is extracted according to the typical position of the eyes within the face. Finally, digital image processing techniques are applied to the eye region of interest to extract pupil features (see the first sketch following this abstract).

(2) Establishment of a human-machine gaze model. Calibration points are displayed on the device screen, the user fixates on each calibration point in turn, and the gaze predictions of the eye movement model are recorded and analyzed. When the head does not move, the vector between the pupil center and the inner eye corner is substituted into a static mapping model to estimate the gaze point. When the head is unconstrained, however, the accuracy of this model degrades sharply. To compensate for the gaze error caused by head movement, this thesis proposes a neural-network-based gaze angle compensation model: changes in the feature points of different parts of the face indirectly reflect head movement, so the facial features are used as the network input and the gaze error as the output. By training this network, a dynamic line-of-sight compensation model is obtained; experimental comparison shows that the accuracy of gaze estimation is greatly improved (see the second sketch below).

(3) Eye movement control applications. The gaze estimation model is combined with the mouse: gaze point estimation replaces mouse movement, and blinking replaces mouse click events. This thesis proposes a new blink detection algorithm based on eye feature points, tests it on a public dataset, and verifies experimentally that it achieves good results (a generic sketch of such a detector is given below). Finally, the technology is applied to different software applications, confirming the effectiveness of the method.
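As an illustration of step (1), the following is a minimal sketch that uses OpenCV's bundled Haar cascades (AdaBoost-trained) for face and eye detection and a simple threshold-based pupil-center estimate. The cascade choices and threshold values are assumptions for illustration; the thesis's ASM landmark step is not reproduced here.

```python
# Hedged sketch: Haar/AdaBoost face detection, eye ROI extraction, and a
# threshold-based pupil-center estimate. Cascade names and threshold values
# are assumptions, not the thesis's exact pipeline (which also uses ASM).
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def pupil_center(eye_gray):
    """Estimate the pupil center as the centroid of the darkest blob."""
    blur = cv2.GaussianBlur(eye_gray, (5, 5), 0)
    _, mask = cv2.threshold(blur, 40, 255, cv2.THRESH_BINARY_INV)  # assumed threshold
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def extract_eye_features(frame_bgr):
    """Return pupil centers (full-frame coordinates) for each detected eye."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    features = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face_roi = gray[y:y + h, x:x + w]
        # Eyes lie in the upper half of the face region.
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi[: h // 2]):
            eye_roi = face_roi[ey:ey + eh, ex:ex + ew]
            c = pupil_center(eye_roi)
            if c is not None:
                features.append((x + ex + c[0], y + ey + c[1]))
    return features
```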
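For step (2), the sketch below shows one way the two-stage idea can be realized: a static second-order polynomial mapping from the pupil-corner vector to screen coordinates, fitted from calibration data, plus a small feed-forward network that predicts the gaze error from facial feature points to compensate for head movement. The polynomial form, network size, and training settings are assumptions, not the thesis's exact configuration.

```python
# Hedged sketch of step (2): static polynomial gaze mapping plus a small
# neural-network head-movement compensation model. Polynomial order, network
# size, and training settings are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

def poly_terms(v):
    """Second-order polynomial terms of a pupil-corner vector (vx, vy)."""
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx * vx, vy * vy])

def fit_static_mapping(pupil_corner_vectors, screen_points):
    """Least-squares fit from calibration data collected with a still head."""
    A = np.array([poly_terms(v) for v in pupil_corner_vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.array(screen_points), rcond=None)
    return coeffs  # shape (6, 2): maps polynomial terms to (x, y) on screen

def predict_static(coeffs, v):
    return poly_terms(v) @ coeffs

# Compensation model: facial feature points -> gaze error (dx, dy).
compensator = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000)

def train_compensation(face_features, gaze_errors):
    """face_features: flattened landmark coordinates per sample;
    gaze_errors: (predicted - true) gaze points when the head moves freely."""
    compensator.fit(np.asarray(face_features), np.asarray(gaze_errors))

def predict_gaze(coeffs, v, face_feature_vec):
    """Static estimate corrected by the learned head-movement compensation."""
    correction = compensator.predict(np.asarray(face_feature_vec).reshape(1, -1))[0]
    return predict_static(coeffs, v) - correction
```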
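For step (3), a common way to detect blinks from eye feature points is an eye-aspect-ratio test over six eye landmarks. The sketch below follows that generic approach; the landmark ordering, threshold, and frame count are assumptions, and the thesis's own blink algorithm may differ.

```python
# Hedged sketch of step (3): blink detection from eye feature points via an
# eye-aspect-ratio (EAR) test. Threshold and frame count are assumptions.
import numpy as np

EAR_THRESHOLD = 0.2       # assumed: eye counts as "closed" below this ratio
MIN_CLOSED_FRAMES = 2     # assumed: consecutive closed frames forming a blink

def eye_aspect_ratio(pts):
    """pts: six (x, y) eye landmarks ordered corner, top x2, corner, bottom x2."""
    pts = np.asarray(pts, dtype=float)
    vertical = np.linalg.norm(pts[1] - pts[5]) + np.linalg.norm(pts[2] - pts[4])
    horizontal = 2.0 * np.linalg.norm(pts[0] - pts[3])
    return vertical / horizontal

class BlinkDetector:
    def __init__(self):
        self.closed_frames = 0

    def update(self, eye_landmarks):
        """Return True when a blink (close-then-open) has just completed."""
        if eye_aspect_ratio(eye_landmarks) < EAR_THRESHOLD:
            self.closed_frames += 1
            return False
        blinked = self.closed_frames >= MIN_CLOSED_FRAMES
        self.closed_frames = 0
        return blinked
```

A blink event detected this way can then be mapped to a mouse click, while the gaze point from step (2) drives the cursor position.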
Keywords/Search Tags: Gaze tracking, eye-tracking human-machine control, gaze calibration, human-computer interaction, neural network