
Automatic human behavior tracking and analysis

Posted on: 2012-09-29
Degree: Ph.D.
Type: Dissertation
University: Rensselaer Polytechnic Institute
Candidate: Chen, Jixu
Full Text: PDF
GTID: 1458390008993221
Subject: Engineering
Abstract/Summary:
Automatic human behavior analysis is receiving increasing attention for a wide range of applications, including human-computer interaction (HCI), medical diagnosis, security surveillance, games, and entertainment. By using video cameras together with computer vision techniques to interpret and understand human behaviors, video-based human sensing has the advantages of being non-intrusive and natural.

Humans usually perform several behaviors simultaneously, from detailed eye movements that reveal attention, through facial and head movements, to global body movement. To understand human behavior holistically, we develop computer vision techniques to track and analyze various spontaneous human behaviors, namely body motion, facial motion, and eye gaze.

First, we develop algorithms to track natural and complex human body movements. Body tracking is challenging because of the high dimensionality of the state space, ambiguous image measurements, and pose variations across activities and subjects. To cope with composite dynamics in a high-dimensional space, we first introduce a switching Gaussian Process Dynamic Model (SGPDM). The SGPDM projects the body pose into a low-dimensional latent subspace and automatically switches states to adjust this projection. This model, however, requires a large amount of training data and does not generalize well to unseen body poses. To overcome these limitations, we introduce a knowledge-based body pose estimation method. Unlike existing data-driven methods, our method exploits constraints on natural body motion. We systematically identify and represent these constraints from principles of physics, biomechanics, and anatomy. A body pose model is then learned from these constraints without any training data, and it generalizes well to a variety of natural body activities.

Second, for eye gaze tracking, we propose a probabilistic gaze tracking system that requires no explicit personal calibration. Unlike traditional gaze tracking methods, which use a personal calibration session to deterministically estimate the personal eye parameters, our approach estimates probability distributions over the eye parameters and the gaze by combining image saliency with a 3D eye gaze model. Through a new incremental learning algorithm, the system automatically adapts to the user and improves its gaze estimates as the user works with it naturally.

Third, for facial motion analysis, we introduce a method that performs facial feature tracking and facial action recognition simultaneously, in contrast to existing methods that typically perform the two tasks separately. Our method employs a unified graphical model to capture the spatial and temporal relationships between feature points and facial expressions. During tracking and recognition, the captured relationships are combined with image measurements, improving both facial feature tracking and facial expression recognition.

Compared to existing human behavior tracking and analysis techniques, our techniques require less training data, achieve higher accuracy, provide a more natural user interface, and generalize easily to new users. They have been extensively tested under varied conditions, including different subjects, different body activities, and different training and testing data sets.
The experimental studies show significant improvements of our techniques over existing state-of-the-art techniques.
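To make the switching idea in the body tracking work more concrete, the following is a minimal, self-contained Python sketch of switching latent dynamics. It is not the dissertation's SGPDM: PCA and per-activity linear dynamics stand in for the Gaussian-process latent mapping and dynamics, the synthetic "activities" are invented for the demo, and switching is reduced to picking the activity model with the lowest one-step prediction error.

```python
# Simplified sketch of switching latent dynamics for body pose tracking.
# NOTE: illustrative stand-in, not the SGPDM itself. PCA replaces the GP
# latent mapping, and per-activity linear dynamics replace the GP dynamics.
import numpy as np


def fit_activity_model(pose_sequence, latent_dim=3):
    """Fit a low-dimensional latent space (PCA) and linear latent dynamics
    x_{t+1} ~= A x_t for one activity's pose sequence (T x D array)."""
    mean = pose_sequence.mean(axis=0)
    centered = pose_sequence - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:latent_dim]                      # (latent_dim, D) projection
    latents = centered @ basis.T                 # (T, latent_dim) trajectory
    x_prev, x_next = latents[:-1], latents[1:]
    dynamics, *_ = np.linalg.lstsq(x_prev, x_next, rcond=None)
    return {"mean": mean, "basis": basis, "A": dynamics.T}


def predict_next_pose(model, pose):
    """Project a pose into the latent space, advance one step, project back."""
    x = (pose - model["mean"]) @ model["basis"].T
    x_next = model["A"] @ x
    return model["mean"] + x_next @ model["basis"]


def track_with_switching(models, observed_poses):
    """At each step, keep the activity model whose one-step prediction best
    matches the new observation (the 'switching' part of the idea)."""
    active = []
    for t in range(1, len(observed_poses)):
        errors = {
            name: np.linalg.norm(predict_next_pose(m, observed_poses[t - 1])
                                 - observed_poses[t])
            for name, m in models.items()
        }
        active.append(min(errors, key=errors.get))
    return active


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4 * np.pi, 200)[:, None]
    # Two synthetic "activities" living on different low-dimensional manifolds.
    walk = np.hstack([np.sin(t), np.cos(t), 0.5 * np.sin(2 * t)]) @ rng.normal(size=(3, 30))
    jump = np.hstack([np.sin(3 * t), t / 10, np.cos(3 * t)]) @ rng.normal(size=(3, 30))
    models = {"walk": fit_activity_model(walk), "jump": fit_activity_model(jump)}
    # Track a sequence that switches from walking to jumping halfway through.
    test = np.vstack([walk[:50], jump[:50]]) + 0.01 * rng.normal(size=(100, 30))
    print(track_with_switching(models, test)[:5], "...", track_with_switching(models, test)[-5:])
```

Similarly, the calibration-free gaze adaptation can be caricatured as incremental Bayesian estimation of a per-user parameter. The sketch below is a deliberately simplified assumption, not the dissertation's method: the eye parameters are reduced to a 2-D gaze offset with a Gaussian prior, and the peak of a saliency map plays the role of the noisy supervision that a personal calibration session would otherwise provide.

```python
# Simplified sketch of calibration-free gaze adaptation via incremental
# Bayesian updates. NOTE: illustrative only; the real system estimates full
# distributions over 3-D eye parameters, not just a 2-D pixel offset.
import numpy as np


class GazeBiasEstimator:
    def __init__(self, prior_var=100.0, obs_var=25.0):
        self.mean = np.zeros(2)            # posterior mean of the offset (pixels)
        self.var = np.full(2, prior_var)   # posterior variance (diagonal)
        self.obs_var = obs_var             # noise of a saliency-derived "label"

    def update(self, predicted_gaze, saliency_peak):
        """Treat (saliency_peak - predicted_gaze) as a noisy observation of the
        offset and fold it into the posterior (per-axis Kalman-style update)."""
        observed_offset = np.asarray(saliency_peak) - np.asarray(predicted_gaze)
        gain = self.var / (self.var + self.obs_var)
        self.mean = self.mean + gain * (observed_offset - self.mean)
        self.var = (1.0 - gain) * self.var

    def correct(self, predicted_gaze):
        """Apply the current offset estimate to a raw gaze prediction."""
        return np.asarray(predicted_gaze) + self.mean


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_offset = np.array([12.0, -7.0])   # unknown per-user bias (pixels)
    estimator = GazeBiasEstimator()
    for _ in range(50):
        raw = rng.uniform(0, 800, size=2)                     # uncalibrated estimate
        peak = raw + true_offset + rng.normal(0, 5, size=2)   # salient point looked at
        estimator.update(raw, peak)
    print("estimated offset:", np.round(estimator.mean, 1))
```

In both sketches the key design choice mirrors the abstract: the system refines itself from data it sees during normal use (new pose observations, new fixations) rather than from an explicit, per-user training or calibration phase.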
Keywords/Search Tags: Human, Tracking, Eye gaze, Techniques, Body pose, Data, Training, Existing