Nowadays, as the number of digital devices in our lives increases, more natural and efficient human-machine interfaces (HMIs) are required to improve operability. Recent research has shown that human cognitive information can be used to construct natural HMI systems. Eye movement is one kind of available cognitive information: it reflects changes in a person's attention and is closely related to their concerns, inner emotions, and task intentions. Eye movement is therefore an ideal input channel for convenient and natural HMI. Based on eye movement information, this paper established two types of non-desktop HMI systems for different application scenarios. According to the controlled task, the two systems are a static object control system based on fixation point detection and a dynamic object control system based on eye gesture recognition.

Static object control system based on fixation point detection. In this system, we incorporated a vision-based object recognition method into the traditional fixation-based HMI, and a double-selection mechanism was designed to solve the Midas Touch problem. Using a dwell-time-based approach, a candidate input fixation is selected and the corresponding ROI (region of interest) is extracted. Object recognition based on shape matching then avoids errors caused by non-input fixations and identifies the fixated object. The system was applied to control a real air conditioner: four functions (switch, mode, heating, and cooling) were controlled by gazing at four different icons, with the commands executed by a relay control module.

Dynamic object control system based on eye gesture recognition. First, the relation between eye gestures and the pupil-center trajectory was analyzed; the eye gestures included voluntary blinks, fixation, and slow saccades (eye movements up, down, left, and right). A judgment condition for voluntary blinks was then derived, and the slopes of the x and y trajectories were identified as classification features, calculated by fitting straight lines with the RANSAC algorithm. Using the AdaBoost algorithm, classification was first realized with four binary classifiers; to improve on this, the SAMME algorithm was adopted to solve the five-class classification problem directly. The controlled object was an omnidirectional wheelchair: after an eye gesture is recognized, the subsystem generates control commands that drive the wheelchair forward, backward, left, or right, or stop it.

Overall, this paper studied three core problems of eye-movement-based HMI systems: 1) how to obtain eye movement information, 2) how to extract features for classification, and 3) how to use the classification results to control external devices. Through extensive experiments, reasonable and efficient solutions were explored, and the results showed that using eye movement information for HMI is feasible and accurate. The study is a promising attempt to apply this natural HMI technique in daily practice, especially for people with disabilities.
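To make the dwell-time-based fixation selection concrete, a minimal Python sketch follows. The function name, gaze-sample format, and the 30 px / 800 ms thresholds are illustrative assumptions, not values taken from the paper:

```python
import math

def detect_fixation(gaze_samples, radius_px=30.0, dwell_ms=800.0):
    """Return (start_idx, end_idx) of the first dwell-time fixation, or None.

    gaze_samples: list of (timestamp_ms, x, y) tuples from the eye tracker.
    radius_px and dwell_ms are illustrative thresholds, not the paper's values.
    """
    start = 0
    for i in range(1, len(gaze_samples)):
        t0, x0, y0 = gaze_samples[start]
        t, x, y = gaze_samples[i]
        # If the gaze leaves the tolerance circle, restart the dwell window.
        if math.hypot(x - x0, y - y0) > radius_px:
            start = i
            continue
        # Gaze has stayed within the circle long enough: report a fixation.
        if t - t0 >= dwell_ms:
            return start, i
    return None
```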
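The shape-matching recognition of the fixated icon could be realized, for example, with Hu-moment contour matching via OpenCV's cv2.matchShapes. The paper does not specify its matching method beyond "shape matching", so the sketch below is an assumed implementation with illustrative names and thresholds:

```python
import cv2

def recognize_icon(roi_gray, templates, max_distance=0.1):
    """Match the largest contour in the fixated ROI against icon templates.

    templates: dict mapping icon name -> template contour (assumed format).
    The max_distance threshold is illustrative, not from the paper.
    """
    # Binarize the ROI and extract outer contours.
    _, binary = cv2.threshold(roi_gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    target = max(contours, key=cv2.contourArea)
    # Lower matchShapes score means more similar shapes (Hu-moment distance).
    name, score = min(
        ((n, cv2.matchShapes(target, c, cv2.CONTOURS_MATCH_I1, 0.0))
         for n, c in templates.items()),
        key=lambda item: item[1])
    return name if score < max_distance else None
```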
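For the dynamic system, the slope features come from RANSAC line fits of the pupil-center trajectory. A minimal sketch using scikit-learn's RANSACRegressor (an assumed library choice; the sampling rate and array shapes are illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, RANSACRegressor

def trajectory_slopes(pupil_xy, fs=60.0):
    """Fit x(t) and y(t) of a pupil-center trajectory with RANSAC lines.

    pupil_xy: (N, 2) array of pupil-center coordinates during one gesture.
    fs: assumed camera sampling rate in Hz (not specified in the abstract).
    Returns (slope_x, slope_y), the two classification features.
    """
    n = len(pupil_xy)
    t = (np.arange(n) / fs).reshape(-1, 1)  # sample times in seconds
    slopes = []
    for axis in range(2):
        # RANSAC discards outlier samples (e.g. blinks, detection noise)
        # before fitting the straight line.
        ransac = RANSACRegressor(LinearRegression())
        ransac.fit(t, pupil_xy[:, axis])
        slopes.append(float(ransac.estimator_.coef_[0]))
    return tuple(slopes)
```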
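Finally, the five-class SAMME classifier that replaces the four binary AdaBoost classifiers could be sketched with scikit-learn's AdaBoostClassifier, which implements the SAMME algorithm (assuming scikit-learn >= 1.2; the class names and hyperparameters below are assumptions, not the paper's settings):

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

GESTURES = ["up", "down", "left", "right", "blink"]  # five gesture classes

def train_gesture_classifier(features, labels):
    """Train one five-class SAMME AdaBoost classifier on slope features.

    features: (N, 2) array of (slope_x, slope_y) pairs.
    labels: (N,) array of indices into GESTURES.
    """
    clf = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=1),  # decision stumps
        n_estimators=100,
        algorithm="SAMME",  # multi-class SAMME instead of binary AdaBoost
    )
    clf.fit(features, labels)
    return clf
```

A recognized gesture would then be mapped to a wheelchair command, e.g. `GESTURES[clf.predict([[slope_x, slope_y]])[0]]` selecting among forward, backward, left, right, and stop.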