To create intelligent robots that can react to their environment through computer vision, it is of interest to study how humans and animals receive and process visual information. Flying animals, such as birds and bats, use a vision processing technique called optical flow to navigate the environment. The key to making use of optical flow for feedback control is the idea of time-to-transit, a measure of how long it will take an observer to pass an object in its field of view. Using optical flow data alone, this time-to-transit (tau) can be computed without knowing either the distance to the object or its size. Tau can be computed in real time and used as input to autonomous vehicle control laws. Vision-based navigation of autonomous robotic vehicles can support applications in both the military and civilian sectors.

In this work, a series of feedback control laws for autonomous robot navigation, whose inputs are the frames of a real-time video sequence, is developed. Two control laws, coined motion primitives, are developed based on tau balancing and tau difference maximizing, and protocol switching logic is established to determine when each should be employed. The tau balancing law utilizes information on both the right and left sides of the path environment, when available, and attempts to balance between them. The tau difference maximizing primitive, by contrast, aligns the vehicle motion with features on one side or the other. A third navigation strategy is also implemented, in which the stages of sensing, perceiving, and acting are separated. A simulation environment is also developed as a test-bed for studying the effects of changing control law parameters and decision variables for protocol switches.

In some cases, it may appear as though one strategy can be used when the other is actually required.
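The central quantities above can be made concrete with a minimal sketch. The function names, the gain `k`, and the exact form of the steering law below are illustrative assumptions, not the control laws developed in this work:

```python
def time_to_transit(x, x_dot):
    """Time-to-transit (tau) of a tracked image feature: tau = x / x_dot.

    Here x is the feature's image coordinate relative to the focus of
    expansion and x_dot its optical-flow rate across the image. Note
    that neither the distance to the object nor its size appears.
    """
    return x / x_dot

def tau_balancing(tau_left, tau_right, k=1.0):
    """Illustrative balancing primitive (hypothetical form): steer in
    proportion to the tau mismatch so that the taus sensed on the left
    and right sides of the path are driven toward equality."""
    return k * (tau_left - tau_right)
```

A feature centered between equally distant walls yields `tau_left == tau_right` and a zero steering command; an asymmetry produces a corrective turn toward the side that will take longer to transit.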
Such situations are referred to as occurrences of perceptual aliasing: the misinterpretation of perceptual cues, leading to the execution of an unsuitable action. Such misunderstanding of the environment can lead to dangerous motions of the vehicle, as would occur when the control attempts to steer the vehicle between features on the left and right sides of a solid obstacle or wall in the vehicle's path. Without safeguards in place to prevent this misinterpretation, perceptual aliasing could cause a robot to collide with obstacles in its environment. Perceptual aliasing can occur whenever the most intuitive control strategy will not result in successful navigation. The problem is addressed by drawing on studies of human and animal perception, together with a statistical analysis of the structure of optical flow and time-to-transit, to intelligently select which control strategy to implement. These control laws are composed to allow a robot to autonomously navigate a corridor environment with both straight and turning sections.
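One way to picture such a safeguard is as a switching guard over the sensed taus. The rule and the thresholds below are hypothetical illustrations of protocol-switching logic, not the decision variables established in this work:

```python
def select_primitive(tau_left, tau_right, tau_min=1.5, eps=0.25):
    """Illustrative protocol-switching guard (hypothetical thresholds).

    Balanced taus normally suggest a navigable gap between left and
    right features. But if both taus are also small, the apparent gap
    may really be a single frontal obstacle whose edges were read as
    two walls (perceptual aliasing), so the guard falls back to the
    one-sided primitive rather than steering into it.
    """
    balanced = abs(tau_left - tau_right) < eps
    imminent = min(tau_left, tau_right) < tau_min
    if balanced and not imminent:
        return "tau_balancing"
    return "tau_difference_maximizing"
```

The point of the sketch is only that the switch cannot depend on tau symmetry alone; some additional statistic of the flow field is needed to disambiguate a corridor from an obstacle.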