
Fusion of imaging and inertial sensors for navigation

Posted on: 2007-02-01
Degree: Ph.D
Type: Dissertation
University: Air Force Institute of Technology
Candidate: Veth, Michael J
Full Text: PDF
GTID: 1448390005977855
Subject: Engineering
Abstract/Summary:
The introduction of the Global Positioning System changed the way the United States Air Force fights by delivering world-wide, precision navigation capability to even the smallest platforms. Unfortunately, the Global Positioning System signal is not available in all combat environments (e.g., under tree cover, indoors, or underground), so operations in these environments are limited to non-precision tactics. The motivation of this research is to address the limitations of current precision navigation methods by fusing imaging and inertial systems, an approach inspired by the navigation capabilities of animals. The research begins by rigorously describing the imaging and navigation problem and developing practical models of the sensors, then presents a transformation technique to detect features within an image. Given a set of features, a rigorous, statistical feature projection technique is developed which uses inertial measurements to predict vectors in the feature space between images. This deep coupling of the imaging and inertial sensors is then used to aid the statistical feature matching function. The feature matches and inertial measurements are then used to estimate the navigation trajectory online with an extended Kalman filter. After proper calibration, the image-aided inertial navigation algorithm is tested in a combination of simulations and ground tests with both tactical- and consumer-grade inertial sensors. While limitations of the extended Kalman filter are identified, the experimental results demonstrate a navigation performance improvement of at least two orders of magnitude over the respective inertial-only solutions.
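
To illustrate the general idea described above, the following is a minimal sketch, not the dissertation's actual implementation: it propagates a camera state with an inertial increment, projects a known landmark into the predicted image to constrain the feature search, and fuses the matched pixel measurement with an extended Kalman filter. The focal length, landmark position, noise levels, time step, and the assumption of known camera attitude are all illustrative choices, and the function names are hypothetical.

    # Illustrative sketch only: inertial-aided feature prediction feeding an
    # extended Kalman filter update. All constants are assumed values.
    import numpy as np

    F_PX = 500.0                              # assumed focal length (pixels)
    LANDMARK = np.array([2.0, 1.0, 10.0])     # assumed known 3D feature (m)

    def project(p_cam):
        """Pinhole projection of a point expressed in the camera frame."""
        x, y, z = p_cam
        return np.array([F_PX * x / z, F_PX * y / z])

    def ekf_step(x, P, dv, z_meas, dt=0.1, q=0.05, r_px=1.0):
        """One propagate/update cycle.
        x      : state [px, py, pz, vx, vy, vz] (camera position, velocity)
        P      : 6x6 covariance
        dv     : inertial velocity increment over the interval (m/s)
        z_meas : matched feature pixel coordinates (u, v)
        """
        # --- propagate the state with the inertial measurement ---
        Phi = np.eye(6)
        Phi[0:3, 3:6] = dt * np.eye(3)
        x = Phi @ x
        x[3:6] += dv
        P = Phi @ P @ Phi.T + q**2 * np.eye(6)   # crude process noise for the sketch

        # --- predicted feature location; in practice this bounds the matching search ---
        p_cam = LANDMARK - x[0:3]                # camera attitude assumed known (identity)
        z_pred = project(p_cam)

        # --- EKF update with the matched feature ---
        xc, yc, zc = p_cam
        H = np.zeros((2, 6))
        H[0, 0] = -F_PX / zc;  H[0, 2] = F_PX * xc / zc**2
        H[1, 1] = -F_PX / zc;  H[1, 2] = F_PX * yc / zc**2
        S = H @ P @ H.T + r_px**2 * np.eye(2)
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z_meas - z_pred)
        P = (np.eye(6) - K @ H) @ P
        return x, P, z_pred

    # Example: one step with a simulated measurement near the prediction.
    x1, P1, z_pred = ekf_step(np.zeros(6), np.eye(6),
                              dv=np.array([0.0, 0.0, 0.1]),
                              z_meas=np.array([101.0, 49.0]))
    print("predicted pixel:", z_pred, "updated position:", x1[0:3])

The sketch uses a single landmark and a position-velocity state for brevity; the dissertation's filter operates on full navigation states with tactical- and consumer-grade inertial error models.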
Keywords/Search Tags: Inertial, Navigation