
Research On Spatiotemporal Saliency And Filter Algorithm Of Wearable Eye Tracking System

Posted on: 2016-02-09
Degree: Master
Type: Thesis
Country: China
Candidate: H R Zeng
Full Text: PDF
GTID: 2308330464456883
Subject: Computer application technology
Abstract/Summary:
Eye tracking is a technique that estimates the line of sight and predicts the point of regard by detecting eye movements through mechanical, optical, electronic, or other means. The technology has a wide range of applications, such as assistive devices for disabled people, interactive consumer electronics, psychological and physiological studies, and virtual reality. In recent years, smart wearable devices have developed rapidly; smart glasses, watches, clothing, and other such products are gradually entering people's lives and changing the way they live.

Gaze tracking systems can be divided by construction into wearable and desktop types, and by detection method into model-based and appearance-based approaches. Model-based methods determine the gaze direction by extracting geometric properties of the human eye with dedicated hardware such as multiple synchronized cameras and infrared light sources. Appearance-based methods, by contrast, require no special hardware; they only need the natural appearance of the eyes observed by a commodity camera. Many different implementations have been proposed, such as electro-oculography (EOG), scleral contact lenses/search coils, photo-oculography (POG) or video-oculography (VOG), and video-based combined pupil and corneal reflection.

Most eye tracking methods share a common problem: they require an explicit calibration process for each user. Users of these systems must actively participate in the calibration task by explicitly fixating on reference points. Another problem is calibration drift, meaning that calibration accuracy depends heavily on the individual user and the installation settings. In many cases such an active calibration procedure is too restrictive, because it interrupts the user's natural behavior and makes unconscious gaze estimation impossible. Although multiple light sources, stereo cameras, and other special hardware can reduce the number of per-user calibration reference points, the user must still take part in the calibration task, which limits the applicable scenarios.

This paper proposes a gaze estimation method based on visual saliency maps that requires no active personal calibration; a gaze estimator can be built while the user simply watches a video clip. In this method, the saliency map of each video frame is regarded as a probability distribution over gaze points. By aggregating the saliency maps associated with similar eye images, we can effectively identify the corresponding gaze point on the saliency map. Gaussian process regression is then used to establish the mapping between eye appearance and gaze point. By using an improved saliency map extraction algorithm and a feedback loop from the estimated line of sight to the gaze probability maps, the accuracy of eye tracking is improved. In addition, by optimizing the eye image filtering algorithm, the reliability and applicable scope of the system are enhanced.
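The pipeline described in the last paragraph can be illustrated with a minimal sketch: each frame's saliency map is treated as a gaze-point probability distribution, maps belonging to similar eye images are aggregated, and a Gaussian process regressor maps eye-appearance features to gaze coordinates. The code below is not taken from the thesis; the function names, the Gaussian similarity weighting, the choice of scikit-learn's GaussianProcessRegressor with an RBF kernel, and the assumption that eye images have already been reduced to feature vectors are all illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def aggregate_saliency(eye_features, saliency_maps, query_feature, sigma=1.0):
    # Weight each frame's saliency map by how similar its eye appearance is
    # to the query (Gaussian weighting on feature distance is an assumption),
    # then return the weighted mean map renormalized to a probability map.
    dists = np.linalg.norm(eye_features - query_feature, axis=1)
    weights = np.exp(-0.5 * (dists / sigma) ** 2)
    weights /= weights.sum()
    agg = np.tensordot(weights, saliency_maps, axes=1)
    return agg / agg.sum()

def expected_gaze(prob_map):
    # Expected (x, y) gaze coordinate under the saliency probability map.
    h, w = prob_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.array([np.sum(xs * prob_map), np.sum(ys * prob_map)])

def train_gaze_estimator(eye_features, saliency_maps):
    # Pair every eye-appearance feature with the gaze point implied by its
    # aggregated saliency map, then fit a GP regressor from features to (x, y).
    targets = np.array([
        expected_gaze(aggregate_saliency(eye_features, saliency_maps, f))
        for f in eye_features
    ])
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(eye_features, targets)
    return gpr  # gpr.predict(new_feature[None]) yields an (x, y) gaze estimate

In a full system, the thesis's feedback loop would feed the regressor's gaze estimates back to refine the gaze probability maps, and an eye image filtering step (e.g. blink detection) would discard unusable frames before training; neither step is shown in this sketch.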
Keywords/Search Tags: Eye gaze tracking, Visual saliency, Blink detection, Passive calibration