
On-line estimation of visual-motor models for robot control and visual simulation

Posted on: 1999-12-03
Degree: Ph.D.
Type: Dissertation
University: The University of Rochester
Candidate: Jagersand, Martin Ulf
Full Text: PDF
GTID: 1468390014969953
Subject: Computer Science
Abstract/Summary:
We present a new integrated approach for combined visual model acquisition, visual servo control, and image-based visual simulation. The approach differs from previous work in that a full coupled Jacobian is estimated on-line without any prior models, special calibration movements, or the use of absolute world coordinate systems. A collection of these Jacobians forms a sparse piecewise linear model of the underlying visual-motor function. A trust region controller allows stable and convergent control when the underlying visual-motor model is highly non-linear, and determines the mesh size on which the model is estimated. At the high level, actions and tasks are coded in terms of desired general perceptions rather than motor sequences. We argue that our vision-space approach is particularly suited for easy teaching/programming of a robot. For instance, a task can be taught by supplying an image sequence illustrating it, or, in teleassistance, by a human visually pointing out the desired manipulations. The resulting robot behavior is robust to changes in the environment, dynamically adjusting the motor control rules in response to environmental variation. We provide an extensive experimental evaluation of the positioning precision and convergence of the visual servoing. At the task (high) level we show how to use the system to solve complex real-world manipulation tasks without needing any a priori calibration of either camera or robot manipulator, or knowledge of the geometric relationships between the cameras, robot, and the manipulated objects.
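The core idea of on-line Jacobian estimation without a prior model can be illustrated with a secant (Broyden-style) rank-one update: after each motion, the Jacobian estimate is corrected so it explains the image-feature change actually observed. The sketch below is an assumption-laden toy, not the dissertation's experimental setup: the `plant` function stands in for the unknown visual-motor map (camera plus robot), the initial Jacobian is a crude identity guess, and a fixed damping factor of 0.3 stands in for the trust region logic.

```python
import numpy as np

def plant(q):
    """Stand-in for the unknown visual-motor function: joint angles -> image features.
    (Hypothetical toy system for illustration only.)"""
    return np.array([np.sin(q[0]) + 0.5 * q[1],
                     q[0] * q[1] + np.cos(q[1])])

def broyden_update(J, dq, dy):
    """Rank-one secant update so that the new estimate satisfies J_new @ dq = dy."""
    denom = dq @ dq
    if denom < 1e-12:          # skip degenerate (near-zero) motions
        return J
    return J + np.outer(dy - J @ dq, dq) / denom

q = np.array([0.2, -0.1])                      # initial joint configuration
y_goal = plant(np.array([0.9, 0.4]))           # desired image features
J = np.eye(2)                                  # crude initial Jacobian guess
y = plant(q)

for _ in range(50):
    # Damped Gauss-Newton step toward the visual goal using the current estimate.
    dq = 0.3 * np.linalg.lstsq(J, y_goal - y, rcond=None)[0]
    q = q + dq
    y_new = plant(q)
    J = broyden_update(J, dq, y_new - y)       # refine the model from observed motion
    y = y_new

print(np.linalg.norm(y - y_goal))              # residual image-space error
```

Note that the controller converges in image space without ever inverting a calibrated camera or kinematic model; the Jacobian estimate is built entirely from the same motions used for servoing.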
Keywords/Search Tags: Robot, Visual, Model