
A daisy-chaining approach for vision-based control and estimation

Posted on: 2010-06-05
Degree: Ph.D
Type: Dissertation
University: University of Florida
Candidate: Mehta, Siddhartha S
Full Text: PDF
GTID: 1448390002979129
Subject: Engineering
Abstract/Summary:
The research presented in this dissertation lies within the general scope of guidance, navigation, and control of autonomous systems and centers on the design and analysis of visual servo control strategies and vision-based robust position and orientation (i.e., pose) estimation. The motivation behind the presented research is to enable a vision system to provide robust navigation and control of autonomous agents operating over a large area. To enable vision systems to provide pose estimates over such an area, a new daisy-chaining method is developed.

The accuracy of any vision-based control and estimation method depends largely on accurate feature-point information. Feature-point errors produce erroneous pose estimates that can degrade the stability and performance of the control and estimation methods. Accurate pose estimation is a non-trivial problem, especially when real-time requirements prohibit computationally complex algorithms. Chapter 2 presents a novel method, PEGUS, for estimating the relative pose between two images captured by a calibrated camera. The method, grounded in statistical theory, exploits redundant feature points in the captured images to produce a robust pose estimate. Experimental results indicate markedly better performance than popular existing methods such as RANSAC and the nonlinear mean-shift algorithm, and the non-iterative structure of the algorithm makes it suitable for real-time applications.

Control of a moving object using a stationary camera, and vice versa, are well-studied problems in the visual servo control literature, and various solutions exist for a class of autonomous systems. However, controlling a moving object using image feedback from a moving camera has remained an open problem because of the unknown relative velocity between the moving camera and the moving object.
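The details of PEGUS are given in Chapter 2; the sketch below is not that algorithm, but a minimal illustration of the underlying idea of exploiting redundant feature points: rather than trusting a single minimal-set pose hypothesis, many hypotheses are combined into one robust estimate. Here the rotation hypotheses are fused with a chordal mean (projecting the averaged rotation matrices back onto SO(3)); the function names and the averaging rule are illustrative assumptions.

```python
import numpy as np

def rotation_about_z(angle):
    """Helper: rotation matrix for an angle about the z-axis (illustrative only)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def chordal_mean(rotations):
    """Fuse redundant rotation hypotheses: average the matrices, then project
    the mean back onto SO(3) via SVD (the chordal / Frobenius-norm mean)."""
    M = np.mean(rotations, axis=0)
    U, _, Vt = np.linalg.svd(M)
    # Fix the determinant sign so the result is a proper rotation.
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
    return U @ D @ Vt
```

For hypotheses scattered symmetrically around a true rotation, the chordal mean recovers that rotation; an outlier-aware variant would weight or prune hypotheses before averaging, which is closer in spirit to what a robust estimator must do.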
In Chapter 3, a collaborative visual servo controller, the daisy-chaining method, is developed to regulate a sensorless unmanned ground vehicle (UGV) to a desired pose using feedback from a moving airborne monocular camera. Multi-view photogrammetric methods are used to develop relationships between the camera frames and UGV coordinate systems, and Lyapunov-based methods are used to prove asymptotic regulation of the UGV.

Another technical challenge when using a vision system for autonomous systems is that the given feature points can leave the camera's field of view (FOV). To address this issue, the method of Chapter 3 is extended by considering multiple reseeding feature points. The resulting multi-reference daisy-chaining scheme enables the UGV/camera pair to operate over an arbitrarily large area. Simulation results illustrate the performance of the developed cooperative control scheme.

Building on the results of Chapter 3, Chapter 4 formulates the more complex problem of cooperative visual servo tracking control, with the objective of enabling a UGV to follow a desired trajectory, encoded as a sequence of images, using image feedback from a moving airborne monocular camera. Both the association problem and the relative velocity problem are addressed by introducing a daisy-chaining structure that links a series of projective homographies and expresses them in a constant reference frame. An adaptive parameter update law actively compensates for the lack of an object model and depth measurements. Based on the open-loop error system, a tracking control law is developed through the extended Barbalat's lemma in a Lyapunov-based framework to yield asymptotic stability.
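The core algebraic step in the daisy-chaining structure, linking a series of projective homographies so that everything is expressed in one constant reference frame, amounts to composing the per-hop homographies by matrix multiplication. The following sketch shows that composition under assumed frame conventions (frame 0 is the constant reference; each hop maps frame i to frame i-1); it is an illustration, not the dissertation's implementation.

```python
import numpy as np

def chain_homographies(hops):
    """Compose per-hop 3x3 projective homographies H_{0,1}, H_{1,2}, ..., H_{n-1,n}
    into the single homography H_{0,n} relating the last frame to frame 0."""
    H = np.eye(3)
    for H_hop in hops:
        H = H @ H_hop
    # Homographies are defined up to scale; normalize so H[2,2] == 1.
    return H / H[2, 2]
```

Because each hop relates only adjacent frames (camera-to-camera or camera-to-reference), no single view ever needs to see both the constant reference frame and the moving object at once, which is exactly what lets the chain span a large area.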
The tracking results are extended to include reseeding stationary feature points by formulating additional projective homography relationships, providing an unrestricted operating area for the UGV/camera pair. Simulation results demonstrate tracking control of a UGV in the presence of multiple stationary reference objects, and visual simultaneous localization and mapping (vSLAM) results are obtained by fusing the daisy-chaining method with a geometric reconstruction scheme. (Abstract shortened by UMI.)
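The reseeding idea can be sketched as a bookkeeping rule: when the current stationary reference object is about to leave the FOV, the homography relating it to a newly acquired reference is folded into the chain, so the constant world frame is preserved across the switch. The class below is a hypothetical illustration of that rebasing step; the frame conventions and names are assumptions, not the dissertation's formulation.

```python
import numpy as np

class DaisyChain:
    """Maintain a homography mapping the current reference object to a fixed
    world frame, rebasing onto a new stationary reference when the old one
    leaves the field of view. Illustrative sketch under assumed conventions."""

    def __init__(self):
        # Maps the current reference frame into the constant world frame.
        self.H_world_ref = np.eye(3)

    def rebase(self, H_ref_newref):
        """Fold in the homography from the outgoing reference to the incoming
        one, so world-frame quantities are continuous across the switch."""
        self.H_world_ref = self.H_world_ref @ H_ref_newref

    def to_world(self, H_ref_cam):
        """Express a camera-to-reference homography in the world frame."""
        return self.H_world_ref @ H_ref_cam
```

Each rebase uses only measurements available while old and new references are simultaneously visible, which is why the operating area of the UGV/camera pair becomes unrestricted.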
Keywords/Search Tags: Daisy-chaining, Estimation, Autonomous systems, UGV, Vision-based, Visual servo, Feature points