
Visual Tracking Based On Subspace Motion Model

Posted on: 2014-01-04
Degree: Master
Type: Thesis
Country: China
Candidate: J Wang
Full Text: PDF
GTID: 2248330398961418
Subject: Computer software and theory
Abstract/Summary:
When dynamic objects must be integrated into a scene, traditional augmented reality techniques such as camera calibration and 3D reconstruction often fail: the real-world information they require may not be available, and their computational cost is too high for real-time use. The visual tracking techniques of computer vision are a promising alternative, since they are computationally cheap and their requirements are easy to satisfy. By tracking online, a user can locate the moving object, guide its behavior, and supply the information needed for interaction between virtual and real objects.

Visual tracking is, in general, a very challenging problem because of the information lost when the 3D world is projected onto a 2D image, changes in the object's texture and geometric appearance, partial or full occlusion, image noise, and real-time processing requirements. A robust tracking system must track a moving object in a video sequence stably over a long period. As the theory and techniques have developed, visual tracking has come to be formulated as a problem of Bayesian inference in state space.

A complete tracking system consists mainly of two parts: a motion model and an appearance model. Most modern tracking algorithms adopt a particle-filter motion model defined on the affine space. This describes the object's motion as a 2D image warp, which does not reflect real motion in 3D space, so the distribution of particles does not match the region the object may occupy very well. Describing the motion properly requires a projection relation from the 3D world to the 2D image. The subspace theory of the appearance trajectory of a rigid object in a video sequence establishes this relation and provides a particle-filter space that matches the object's motion in the 3D world: according to the theory, the object's complete trajectory lies in a subspace of a certain low rank. Building on this, we construct a motion model for visual tracking that records the object's trajectory over a short period, decomposes the trajectory matrix, and runs the particle filter in the camera-dependent vector. This motion model describes object motion in a way that matches the 3D world much better and yields a more even distribution of particles.

On top of this motion model, we build a complete tracking procedure. The object's appearance is described by feature points extracted inside a rectangular template; this description is cheap to compute and largely avoids background contamination. By matching particles against the template with NCC and FB error in a weighted estimate, and then computing the template position, the algorithm obtains the tracking result. Tests on six challenging video sequences, compared with results from other popular tracking methods, show that the tracking algorithm based on the subspace motion model achieves more stable results.
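To make the motion model concrete, the following is a minimal NumPy sketch of the idea described above, not code from the thesis: stack the recent feature-point trajectories into a matrix, split it with a truncated SVD into a camera-dependent factor and a shape factor (the rank-4 bound assumes an affine camera), perturb the current frame's camera-dependent vector to generate particles, and weight hypotheses with NCC. The function names, the Gaussian random-walk noise, and the rank choice are illustrative assumptions.

import numpy as np

def factorize_trajectories(W, rank=4):
    """W: (2F, P) matrix stacking the image x/y coordinates of P tracked
    feature points over the last F frames. Under an affine camera the
    trajectories lie in a rank-<=4 subspace, so a truncated SVD splits
    W ~= M @ S into a camera-dependent factor M and a shape factor S."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    M = U[:, :rank] * s[:rank]        # (2F, rank)  camera/motion factor
    S = Vt[:rank, :]                  # (rank, P)   shape factor
    return M, S

def sample_particles(M, n_particles=300, sigma=0.02, rng=None):
    """Particle filtering in the camera-dependent vector: perturb the two
    rows of M belonging to the most recent frame (a Gaussian random walk
    is assumed here) to generate motion hypotheses for the next frame."""
    rng = np.random.default_rng() if rng is None else rng
    last = M[-2:, :]                                   # (2, rank)
    noise = sigma * rng.standard_normal((n_particles,) + last.shape)
    return last[None, :, :] + noise                    # (N, 2, rank)

def predict_points(particles, S):
    """Map each hypothesised camera vector back to 2-D feature positions."""
    # (N, 2, rank) x (rank, P) -> (N, 2, P) predicted point sets.
    return np.einsum('nrk,kp->nrp', particles, S)

def ncc_score(patch, template):
    """Normalised cross-correlation between a candidate patch and the
    template; the thesis combines this with forward-backward error in a
    weighted estimate to score particles."""
    a = (patch - patch.mean()) / (patch.std() + 1e-8)
    b = (template - template.mean()) / (template.std() + 1e-8)
    return float((a * b).mean())

In use, one would re-factorize the trajectory matrix over a sliding window of recent frames, score each predicted point set against the rectangular template, and take the weighted estimate as the new object position; those steps are omitted here for brevity.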
Keywords/Search Tags:visual tracking, particle filtering, motion model, subspace theory