
Research On SLAM/IMU Integrated Navigation System Based On Vision

Posted on: 2022-04-22    Degree: Master    Type: Thesis
Country: China    Candidate: H X Wang    Full Text: PDF
GTID: 2518306353979969    Subject: Control Science and Engineering
Abstract/Summary:
From the earliest days of computer vision, people have envisioned computers that could understand their surroundings and explore unknown areas the way the human eye does. A computer, however, receives only a matrix of pixel values, and how to identify objects such as faces and text from a purely numerical matrix has become an attractive problem studied continuously by scholars at home and abroad. Decades later, artificial intelligence and machine learning techniques were gradually applied to visual recognition and achieved notable progress, and during this period the understanding of visual navigation also deepened. Over nearly 30 years of SLAM development, it has become possible to compute the pose of a carrier from camera data. A variety of real-time SLAM systems have been built that perform fast position tracking and real-time 3D reconstruction, and many SLAM-related products have emerged, such as indoor sweeping robots, autonomous driving, and virtual- and augmented-reality devices. Since the beginning of the 21st century, visual SLAM technology centered on vision sensors has undergone significant theoretical and practical breakthroughs and is gradually moving from the laboratory to market applications.

This thesis focuses on building a SLAM system that fuses an IMU with a monocular camera, and studies it from the perspectives of hardware construction and the algorithm pipeline. First, the SLAM implementation platform and development environment are introduced. A data-acquisition system is designed on an ARM+GPU architecture, with data links built over USB, serial, and TCP/IP ports. The GPU board serves as the data-acquisition host and runs acquisition software based on ROS and Qt; to provide a good visual interface, an acquisition GUI is developed with Qt4. The ARM board serves as the control board for IMU data acquisition and motor control, and the camera is driven on the
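The thesis does not specify the wire format used on the serial link between the ARM control board and the acquisition host. As a minimal sketch, assuming (hypothetically) that the IMU streams comma-separated ASCII records of the form `t,ax,ay,az,gx,gy,gz`, a host-side parser might look like:

```python
def parse_imu_line(line):
    """Parse one comma-separated IMU record into (timestamp, accel, gyro).

    Assumed record format (hypothetical, for illustration only):
        t,ax,ay,az,gx,gy,gz
    with the timestamp in seconds, accelerations in m/s^2, and
    angular rates in rad/s.
    """
    fields = [float(x) for x in line.strip().split(",")]
    if len(fields) != 7:
        raise ValueError("expected 7 fields: t,ax,ay,az,gx,gy,gz")
    t = fields[0]
    accel = tuple(fields[1:4])   # accelerometer reading
    gyro = tuple(fields[4:7])    # gyroscope reading
    return t, accel, gyro
```

In a ROS-based acquisition node, the parsed tuples would typically be stamped and republished as `sensor_msgs/Imu` messages for the rest of the pipeline.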
GPU. After the image and IMU data are obtained, a visual odometer is first constructed based on feature points and optical-flow tracking, and the measured data are verified on the hardware platform; the trajectory drift that occurs during pure inertial navigation is demonstrated with both simulated and real data. To address the limitations of a vision-only front end, the system is combined with inertial information. Because the IMU samples much faster than the camera, the IMU data between camera frames are integrated using midpoint integration, which avoids pose-estimation failures caused by fast motion of the carrier. Furthermore, a bag-of-words model is introduced and tested in a simple loop-closure experiment using DBoW3.

In the joint visual/IMU pose optimization, the system is initialized jointly using the rotation and translation constraints between the body frame and the camera frame, yielding the initial velocity, position, attitude, scale, and other state information. A tightly coupled nonlinear-optimization back end is then constructed: the IMU pre-integration residual function and the reprojection-error function are derived, the overall residual function of the system is built from them, and the optimization variables are solved simultaneously. To keep the algorithm running in real time, a sliding-window marginalization strategy is adopted to discard old state variables while adding new ones, which keeps the computation fast over long runs. Finally, to verify the algorithm's performance, both the EuRoC dataset and a simulated visual scene (in which the IMU and feature-point data are generated by the simulation environment) are used, and the EVO tool is employed to compute the trajectory error and evaluate accuracy.
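In a full VIO system, midpoint integration propagates the quaternion attitude with the average angular rate and rotates the accelerations into the world frame before averaging. The following rotation-free 1-D sketch is only meant to illustrate the midpoint rule itself: each step uses the average of the measurements at the two endpoints rather than the first one alone.

```python
def midpoint_integrate(t, accel, v0=0.0, p0=0.0):
    """Propagate velocity and position from accelerometer samples
    using midpoint integration (1-D, rotation- and gravity-free,
    for illustration only).

    Each step averages the accelerations at the interval endpoints,
    which is more accurate than Euler integration when the IMU rate
    is much higher than the camera frame rate.
    """
    v, p = v0, p0
    for k in range(len(t) - 1):
        dt = t[k + 1] - t[k]
        a_mid = 0.5 * (accel[k] + accel[k + 1])  # midpoint acceleration
        p += v * dt + 0.5 * a_mid * dt * dt      # position update
        v += a_mid * dt                          # velocity update
    return v, p
```

For constant acceleration the midpoint rule is exact, which makes it easy to sanity-check: two seconds of samples at a = 2 m/s² from rest should give v = 2 m/s and p = 1 m after one second.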
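The EVO tool reports, among other metrics, the absolute trajectory error after timestamp association and SE(3)/Sim(3) alignment. As a simplified illustration of the core metric only (not EVO's actual implementation), the RMSE of per-pose position error over two already-associated, already-aligned trajectories can be computed as:

```python
import math

def ate_rmse(est, ref):
    """Absolute trajectory error: RMSE of per-pose position error.

    `est` and `ref` are equal-length lists of (x, y, z) positions,
    assumed already time-associated and aligned; the real EVO tool
    also performs the association and alignment steps.
    """
    if len(est) != len(ref):
        raise ValueError("trajectories must have equal length")
    sq = [sum((e - r) ** 2 for e, r in zip(pe, pr))
          for pe, pr in zip(est, ref)]
    return math.sqrt(sum(sq) / len(sq))
```

A constant 1 m offset along one axis yields an RMSE of exactly 1 m, which is a convenient check when wiring up an evaluation script.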
Keywords/Search Tags: IMU, Visual Odometry, Pre-integration, Integrated navigation, Sliding window