
Research On Monocular Visual-inertial Odometry

Posted on: 2022-08-27    Degree: Master    Type: Thesis
Country: China    Candidate: Q S Wei    Full Text: PDF
GTID: 2518306470459154    Subject: Master of Engineering
Abstract/Summary:
With the development of electronic positioning and navigation, more and more researchers are striving to find a stable and accurate positioning and navigation method for robotics, autonomous driving, aircraft, AR/VR, and other industries. Many solutions exist, such as lidar and GPS, but each has shortcomings, such as high cost or restrictions imposed by terrain and signal transmission. Visual-Inertial Odometry (VIO), a multi-sensor-fusion approach to positioning and navigation, avoids many of these limitations. This thesis proposes a visual-inertial sensor system designed to be easy to deploy on robots, aircraft, and other targets that require positioning and navigation, giving the target real-time simultaneous localization and mapping (SLAM) capability and lowering the difficulty of studying SLAM-related algorithms. A camera and an IMU are connected to an ARM+FPGA main controller, which provides high-quality gyroscope and accelerometer measurement and calibration synchronized with the image stream in hardware, achieving robustness and accuracy that a vision-only SLAM system struggles to reach. In addition to the raw data, the system also provides data pre-processed on the FPGA, such as image keypoint detection and contrast enhancement, which significantly reduces the computational load of the SLAM algorithm and makes it usable on resource-constrained platforms. This addresses the problem that most current visual-inertial odometry systems rely on standalone USB sensor modules, which leave the heavy algorithmic computation to PC/GPU hardware.

This thesis focuses on camera data acquisition by the ZYNQ FPGA, IMU data acquisition by the ZYNQ ARM cores, bare-metal (OS-free) image reading on the ARM cores and the faults that occur while reading image data, the Bayer data format of the MT9V034 camera, the data protocols of the ZYNQ internal AXI_HP and AXI_GP bus interfaces, and software-hardware co-operation across the multi-core ZYNQ chip. The complete path from the camera to display under Linux is analyzed in detail to ensure that the data is valid and synchronized in real time, so that subsequent sensor-fusion algorithms can process it directly within the ZYNQ Linux operating system. Finally, the system driver is written for ZYNQ Linux, and a high-speed USB interface connects the board to a host computer running the Robot Operating System (ROS) for data transmission and visualization; multi-stage cascaded adjustment is carried out to verify that the system is stable and feasible.
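The hardware-level timestamp synchronization described above is what lets a fusion front end associate each camera frame with the IMU samples recorded since the previous frame. The C++ sketch below illustrates that grouping step; the types and field names (ImuSample, ImageFrame, t_sec, FrameImuPairer) are hypothetical and assume both streams carry timestamps from the same hardware clock. It is not taken from the thesis.

```cpp
#include <cstdint>
#include <deque>
#include <vector>

// Hypothetical measurement types; real field layouts depend on the driver.
struct ImuSample  { double t_sec; double gyro[3]; double accel[3]; };
struct ImageFrame { double t_sec; std::vector<uint8_t> pixels; };

// Buffers IMU samples and, for each new image, returns the samples whose
// hardware timestamps fall between the previous frame and the current one.
class FrameImuPairer {
public:
    void addImu(const ImuSample& s) { imu_buf_.push_back(s); }

    // Returns the IMU slice in (prev_frame_time, frame.t_sec] for this frame.
    std::vector<ImuSample> addImage(const ImageFrame& frame) {
        std::vector<ImuSample> slice;
        while (!imu_buf_.empty() && imu_buf_.front().t_sec <= frame.t_sec) {
            if (imu_buf_.front().t_sec > prev_frame_time_)
                slice.push_back(imu_buf_.front());
            imu_buf_.pop_front();
        }
        prev_frame_time_ = frame.t_sec;
        return slice;
    }

private:
    std::deque<ImuSample> imu_buf_;
    double prev_frame_time_ = 0.0;
};
```

Because the timestamps would originate from one clock on the ZYNQ, no software re-timestamping or clock-drift estimation is needed before the slice is handed to the fusion back end.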
Keywords/Search Tags: Visual-Inertial Odometry, Positioning and Navigation, Sensor Fusion, Real-time, Data Collection