Visual navigation research is developing rapidly, and a large number of algorithms in this field have made substantial progress. In autonomous driving, the integration of vision and inertial navigation systems has enabled a leap forward in the navigation and mapping of unknown areas by driverless robots. When an autonomous driving robot operates in a real application scenario, the process of acquiring information through visual sensors, estimating the pose of the carrier, and completing simultaneous localization and mapping (SLAM) is affected by feature-tracking failures caused by illumination changes, poor robustness, and loss of feature recognition during rapid motion; reducing these effects is the main task of any navigation system. Because of the limitations of vision sensors, a single sensor alone cannot meet the requirements of a high-precision unmanned driving system. Fusing the vision sensor with an inertial navigation system greatly improves the robustness of the whole navigation system and increases its accuracy. A lightweight navigation system built by integrating vision with inertial navigation lets the strengths and weaknesses of the two complement each other while keeping the system low-cost and small.

A loosely coupled method is used to fuse the vision system and the inertial navigation system and to realize high-precision navigation and positioning of the robot. Based on the mainstream ORB-SLAM algorithm, a method of re-extracting features in regions where feature points are sparse is proposed to construct the vision system; this ensures a more complete representation of keyframe information and provides a denser point cloud for back-end mapping (a sketch of the idea is given below). The camera pose is estimated through epipolar geometric constraints and triangulation (see the second sketch). A kinematic model of the IMU is established, and the IMU pose is propagated with the median (midpoint) method (see the third sketch). The visual data are loosely coupled to the IMU data through an extended Kalman filter (sketched last). The fused and optimized data are fed into the loop-closure module to reduce the accumulated error of the system, ensuring real-time performance and achieving more accurate state estimation.
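The following is a minimal sketch of re-extracting features in sparse regions: the image is divided into a grid, and extra ORB keypoints are detected only in cells whose current feature count falls below a threshold. The cell size, the per-cell threshold, and the function name are illustrative assumptions, not the exact parameters of the proposed method.

```python
import cv2
import numpy as np

def reextract_sparse(gray, keypoints, cell=64, min_per_cell=5, orb=None):
    """Detect extra ORB keypoints in grid cells that currently hold too few."""
    orb = orb or cv2.ORB_create(nfeatures=200)
    h, w = gray.shape
    counts = np.zeros((h // cell + 1, w // cell + 1), dtype=int)
    for kp in keypoints:                       # count existing keypoints per cell
        counts[int(kp.pt[1]) // cell, int(kp.pt[0]) // cell] += 1
    extra = []
    for r in range(counts.shape[0]):
        for c in range(counts.shape[1]):
            if counts[r, c] >= min_per_cell:
                continue
            y0, x0 = r * cell, c * cell
            patch = gray[y0:min(y0 + cell, h), x0:min(x0 + cell, w)]
            if patch.size == 0:
                continue
            for kp in orb.detect(patch, None):
                # Shift patch coordinates back into full-image coordinates.
                extra.append(cv2.KeyPoint(kp.pt[0] + x0, kp.pt[1] + y0, kp.size))
    return list(keypoints) + extra
```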
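Next, a minimal sketch of pose estimation from the epipolar constraint followed by triangulation, using standard OpenCV routines. The calibrated intrinsic matrix K and the matched pixel arrays pts1/pts2 are assumed inputs; this is an illustration of the general technique rather than the paper's exact pipeline.

```python
import cv2
import numpy as np

def pose_and_points(pts1, pts2, K):
    """pts1, pts2: Nx2 float arrays of matched pixels in two keyframes."""
    # Essential matrix from the epipolar constraint, with RANSAC for outliers.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    # Projection matrices: first camera at the origin, second at (R, t).
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    X_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)   # 4xN homogeneous
    X = (X_h[:3] / X_h[3]).T                              # Nx3 map points
    return R, t, X
```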
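The third sketch shows median (midpoint) propagation of the IMU state between two samples: the average of consecutive gyroscope and accelerometer readings drives the update of orientation, velocity, and position. Biases are omitted and gravity is assumed known; the quaternion convention [w, x, y, z] and the function names are assumptions for illustration.

```python
import numpy as np

def quat_mul(q, r):
    w1, x1, y1, z1 = q; w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_R(q):
    w, x, y, z = q
    return np.array([[1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
                     [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
                     [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)]])

def propagate(p, v, q, acc0, gyr0, acc1, gyr1, dt, g=np.array([0.0, 0.0, -9.81])):
    w_mid = 0.5 * (gyr0 + gyr1)                        # midpoint angular rate
    dq = np.concatenate(([1.0], 0.5 * w_mid * dt))     # small-angle quaternion
    q_new = quat_mul(q, dq); q_new /= np.linalg.norm(q_new)
    # Midpoint of the two accelerations, each rotated into the world frame.
    a_mid = 0.5 * (quat_to_R(q) @ acc0 + quat_to_R(q_new) @ acc1) + g
    v_new = v + a_mid * dt
    p_new = p + v * dt + 0.5 * a_mid * dt * dt
    return p_new, v_new, q_new
```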
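Finally, a minimal sketch of the loosely coupled fusion step: the IMU propagates a position/velocity state, and the pose estimate from the visual front end enters as a measurement in an extended Kalman filter update. The state layout, the noise values, and the position-only measurement model are illustrative assumptions, not the paper's exact filter design.

```python
import numpy as np

class LooseEKF:
    def __init__(self, p0, v0):
        self.x = np.concatenate([p0, v0])   # state: position (3) and velocity (3)
        self.P = np.eye(6) * 0.1            # state covariance
        self.Q = np.eye(6) * 1e-3           # process noise (assumed value)
        self.R = np.eye(3) * 1e-2           # visual measurement noise (assumed value)

    def predict(self, a_world, dt):
        """Propagate the state with a world-frame acceleration from the IMU."""
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt
        self.x = F @ self.x
        self.x[:3] += 0.5 * a_world * dt * dt
        self.x[3:] += a_world * dt
        self.P = F @ self.P @ F.T + self.Q

    def update(self, p_visual):
        """Correct the state with the camera position from the SLAM front end."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        y = p_visual - H @ self.x                       # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```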