
Robust Tightly-coupled Monocular Visual-inertial SLAM With Wheel Odometer

Posted on: 2020-10-25 | Degree: Master | Type: Thesis
Country: China | Candidate: Z Z Lu | Full Text: PDF
GTID: 2428330590958279 | Subject: Control Science and Engineering
Abstract/Summary:
Visual SLAM is widely used for self-localization and mapping in complex environments, but with a camera alone it is difficult to handle low light, weak texture, and fast motion. Visual-inertial SLAM, which fuses a camera with an IMU, significantly improves robustness and makes the metric scale observable, whereas purely monocular visual SLAM cannot observe scale. A ground mobile robot, however, often moves at constant velocity or rotates in place; in these cases monocular visual-inertial SLAM lacks acceleration excitation, the scale becomes unobservable, and serious positioning error results. For ground mobile robots, introducing a wheel-speed sensor resolves this weak scale observability and improves robustness under abnormal conditions.

This thesis proposes a multi-sensor-fusion SLAM algorithm that uses monocular visual, inertial, and wheel-speed measurements. The sensor measurements are fused in a tightly coupled manner, and nonlinear optimization is used to maximize the posterior probability and solve for the optimal state estimate. The proposed algorithm also provides loop detection and back-end optimization.

The main contributions include: a Mecanum-wheel control algorithm based on torque control, which estimates the reliability of the speed measurements from the motion-constraint error; a chassis-IMU intrinsic and extrinsic calibration algorithm for omnidirectional vehicles that requires no auxiliary equipment; a wheel-odometer pre-integration algorithm that combines chassis speed with IMU angular velocity to avoid repeated integration; a state-initialization algorithm based on the wheel odometer and IMU; and an active detection algorithm for abnormal chassis states that isolates bad chassis speed measurements.

Comparative experiments were carried out in room-scale scenes, building-scale scenes, and anomalous scenarios such as wheel slip, collision, kidnapping, and visual loss. The results show that the proposed algorithm achieves high accuracy, with 2.2 m of cumulative error after 812 m of motion (0.28%, loop-closure optimization disabled), and strong robustness, localizing effectively even under sensor degradation such as visual loss, fast motion, kidnapping, and slippage. Both the accuracy and the robustness of the proposed method are superior to monocular visual-inertial SLAM and a traditional wheel odometer.
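The wheel-odometer pre-integration idea described above can be illustrated with a minimal planar sketch: body-frame chassis velocities are rotated by the yaw accumulated from the IMU gyroscope and summed into a single relative-pose increment, so the raw samples never need to be re-integrated when the optimizer updates the starting pose. The function name, the 2D model, and the sample layout are assumptions for illustration, not the thesis's exact formulation.

```python
import math

def preintegrate_wheel_odometry(samples, dt):
    """Accumulate a relative planar pose increment (dx, dy, dtheta).

    samples: list of (vx, vy, omega_z) tuples -- chassis linear velocity
    in the body frame (m/s) and IMU yaw rate (rad/s), sampled every dt
    seconds. The result is expressed in the frame of the first sample.
    """
    x = y = theta = 0.0
    for vx, vy, omega in samples:
        # Rotate the body-frame velocity into the start frame using the
        # rotation accumulated so far, then integrate position and yaw.
        c, s = math.cos(theta), math.sin(theta)
        x += (c * vx - s * vy) * dt
        y += (s * vx + c * vy) * dt
        theta += omega * dt
    return x, y, theta

# Example: drive straight at 1 m/s for 1 s with no rotation.
dx, dy, dth = preintegrate_wheel_odometry([(1.0, 0.0, 0.0)] * 100, 0.01)
# dx ~ 1.0 m, dy ~ 0.0 m, dth = 0.0 rad
```

In the full tightly coupled system, such an increment would enter the nonlinear optimization as one relative-motion factor alongside the visual and IMU residuals, which is what avoids the repeated integration mentioned above.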
Keywords/Search Tags: Multi-sensor Fusion, Robot Pose Estimation, Simultaneous Localization and Mapping, Visual-Inertial System, Sensor Calibration