The human eye's motion state and content of interest can reveal a person's cognitive and emotional status in a given situation. When observing the surroundings, human eyes make different movements depending on the objects observed, which reflects a person's attention and interest. In this paper, we capture and analyze patterns of human eye gaze and head motion and classify them into different categories. Using machine learning algorithms, we then train an eye-object movement attention model and an eye-object feature preference model based on different people's eye-gaze and head-motion patterns. These models are used to predict a person's object of interest and intended interaction objects from their real-time situation. Furthermore, the eye-gaze and head-motion patterns can be used together as a modality of non-verbal information for computing human emotional states with the PAD emotion model. Our methodology analyzes human emotion and cognition from the perspective of eye gaze and head motion, interprets the cognitive information that human eyes can express, and effectively improves the efficiency of human-computer interaction across different circumstances.
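The classification step described above, mapping eye-gaze and head-motion patterns into attention categories, can be sketched roughly as follows. This is a minimal illustration under assumed feature names (fixation duration, saccade amplitude, head yaw rate) and illustrative category labels; the paper's actual attention and preference models are not specified here, and a simple nearest-centroid classifier stands in for the trained machine learning models.

```python
# Minimal sketch (hypothetical features and labels): classifying
# gaze/head-motion feature vectors into coarse attention categories
# with a nearest-centroid classifier. This only illustrates the
# general pattern, not the paper's specific models.
from statistics import mean

# Each sample: (fixation_duration_s, saccade_amplitude_deg, head_yaw_rate_deg_s)
train = {
    "focused":   [(0.45, 2.1, 1.0), (0.50, 1.8, 0.8), (0.40, 2.5, 1.2)],
    "scanning":  [(0.15, 8.0, 5.0), (0.12, 9.5, 6.2), (0.18, 7.2, 4.5)],
    "searching": [(0.25, 5.0, 12.0), (0.22, 6.1, 14.0), (0.28, 4.8, 11.0)],
}

def centroid(samples):
    """Component-wise mean of a list of feature vectors."""
    return tuple(mean(dim) for dim in zip(*samples))

centroids = {label: centroid(s) for label, s in train.items()}

def classify(features):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(features, centroids[lbl]))

print(classify((0.48, 2.0, 0.9)))   # long fixations, small saccades -> "focused"
print(classify((0.14, 8.8, 5.5)))   # rapid, wide eye movements -> "scanning"
```

In practice, such features would be extracted from eye-tracker and head-pose streams, and a trained model would replace the hand-set centroids above.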