
A real-time system for detecting and tracking people in video

Posted on: 2002-04-04
Degree: Ph.D.
Type: Thesis
University: University of Maryland, College Park
Candidate: Chalidabhongse, Thanarat Horprasert
Full Text: PDF
GTID: 2468390011996902
Subject: Computer Science
Abstract/Summary:
The ability to detect, track, and recognize people in action is a fundamental and crucial task in many vision systems. This thesis presents real-time algorithms for detecting, tracking, and localizing human motion in video sequences.

We present two algorithms for background modeling and background subtraction, used to detect and extract moving objects from video sequences taken by static cameras. The first, a statistical approach (S-BGS), is a robust and efficiently computed method that copes with illumination changes such as shadows and highlights. It is based on a computational color model that separates the brightness component from the chromaticity component. The second, a codebook-based approach (CB-BGS), is built on a quantization technique. The motivation for this method is to handle low-bandwidth compressed video, on which the S-BGS algorithm does not perform well; it can also adaptively model moving backgrounds over long periods of time. Results for both algorithms, together with a newly developed performance evaluation method called Perturbation Detection Rate (PDR) analysis, are presented.

We also describe a real-time vision-based motion capture system for detecting and tracking human movement in 3-D. Multiple cameras observe a person; silhouette analysis, multi-cue tracking, and triangulation yield real-time 3-D estimates of human posture. An early prototype of the system was demonstrated at SIGGRAPH 98. The PC-based computer vision system tracks human movement and lets a person control the movement of a virtual puppet in real time.

Finally, we present an approach for estimating 3-D head orientation in a monocular image sequence. The approach employs image-based parameterized tracking of the face and facial features to locate the region in which point feature locations are estimated. It relies on the coarse structure of the face to compute orientation relative to the camera plane: using the projective invariance of cross-ratios of the eye corners together with anthropometric statistics, we estimate head pitch, yaw, and roll. Analytical and experimental results are reported.
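As a rough illustration of the brightness/chromaticity separation idea behind S-BGS, the sketch below classifies each pixel of a frame against a per-pixel background color model. The function name, the NumPy formulation, and the threshold values are illustrative assumptions for this sketch, not the thesis's actual formulation or trained parameters.

```python
import numpy as np

def classify_pixels(frame, mean, std, a_lo=0.6, a_hi=1.2, cd_thresh=3.0):
    """Illustrative sketch only: label pixels by brightness and chromaticity
    distortion against a per-pixel background color model.

    frame, mean, std: float arrays of shape (H, W, 3); mean/std describe the
    background color distribution per pixel. Thresholds are placeholders.
    """
    eps = 1e-6
    e = mean / (std + eps)   # expected background color, variance-normalized
    i = frame / (std + eps)  # observed color, normalized the same way

    # Brightness distortion: scale factor that best aligns the observed color
    # with the expected background color direction.
    alpha = (i * e).sum(axis=2) / ((e * e).sum(axis=2) + eps)

    # Chromaticity distortion: residual distance off that color direction.
    cd = np.linalg.norm(i - alpha[..., None] * e, axis=2)

    labels = np.full(frame.shape[:2], 'foreground', dtype=object)
    same_chroma = cd < cd_thresh
    labels[same_chroma & (alpha > a_lo) & (alpha < a_hi)] = 'background'
    labels[same_chroma & (alpha <= a_lo)] = 'shadow'    # darker, same chroma
    labels[same_chroma & (alpha >= a_hi)] = 'highlight' # brighter, same chroma
    return labels
```

Separating the two distortions is what allows illumination changes to be handled: a shadowed background pixel keeps roughly the same chromaticity but a lower brightness, whereas a true foreground pixel changes chromaticity as well.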
Keywords/Search Tags: System, Real-time, Tracking, Detecting, Video