
Research On Positioning And Mapping Technology Of LiDAR/Camera/INS/GNSS For Robot

Posted on: 2022-05-04    Degree: Master    Type: Thesis
Country: China    Candidate: L X Zhang    Full Text: PDF
GTID: 2518306497991829    Subject: Circuits and Systems
Abstract/Summary:
To meet the requirements of continuous, accurate positioning and mapping for robots equipped with low-cost sensors in complex campus environments, this thesis proposes a LiDAR/Camera/INS/GNSS positioning and mapping technology for robots, addressing the problems that dense LiDAR is too expensive and that the low-cost sparse LiDAR-camera system is not yet mature. The main research contents and contributions of this thesis are as follows:

Previous LiDAR-camera calibration methods require specific targets or sites, which makes them cumbersome to operate, and their accuracy is not high enough. To address this, this thesis proposes a LiDAR-camera calibration method based on infrared characteristics. The method only requires corner structures that can be found everywhere in daily life, together with images that record the LiDAR's infrared scanning features; while remaining simple to operate, its calibration results reach the current state of the art. Firstly, the feasibility of using infrared features for calibration is analyzed by comparing the quantum efficiency of ordinary industrial cameras with the wavelengths used by LiDAR. Secondly, 2D points corresponding to infrared features are extracted after image preprocessing, and the corresponding 3D points are extracted by surface fitting. Thirdly, the accuracy of the 3D points extracted by the proposed surface fitting is verified through simulation experiments. Finally, the proposed calibration method is fully verified in three respects: mean square error, reprojection error, and projection effect. The LiDAR-camera calibration accuracy obtained by the proposed method is about 0.1 degrees and 0.5 cm, and the reprojection errors along the image X and Y axes are both less than 1.4 pixels. The calibration projection effect is more accurate than that of the mainstream open-source autonomous driving software Autoware, so the proposed method reaches the current state of the art while remaining simple.

Dense LiDAR is too expensive, sparse LiDAR alone cannot extract sufficient features, and the low-cost sparse LiDAR-camera system is not yet mature. To address this, this thesis designs a real-time feature fusion scheme for an asynchronous LiDAR-camera pair under dynamic conditions. Firstly, the spatiotemporal datum of the LiDAR, camera, GNSS, IMU, and other sensors is unified, and the extrinsic parameters between the LiDAR and camera are obtained with the calibration method proposed in this thesis. Secondly, point clouds and images sampled asynchronously under dynamic conditions are registered using the positioning results of GNSS/INS integrated navigation. Thirdly, semantic features of dynamic targets in the image are obtained through deep learning, geometric features of the point cloud are obtained by segmentation and clustering, and on this basis the point cloud and image are merged at the feature level. Finally, the proposed GNSS/INS/ODO/LO fusion scheme is tested and evaluated on a mobile robot platform in a complex campus environment with widespread poor GNSS signals. Compared with the GNSS/INS fusion scheme, the RMS errors in the north, east, and down directions are improved by 79.5%, 55.4%, and 68.1%, the RMS heading error is improved by 11.0%, and the maximum errors in the north, east, and down directions decrease from 9.44 m, 3.46 m, and 3.29 m to 0.79 m, 1.20 m, and 0.94 m, respectively. The average processing times on a desktop computer and on the embedded processor Xavier are 55.5 ms and 37.8 ms. The proposed method integrates the feature information of the sparse LiDAR point cloud and the camera image well, and both platforms achieve real-time processing at 10 Hz.

To address the inaccurate positioning and mapping results caused by the complex campus environment, this thesis proposes a robot positioning and mapping method based on LiDAR, camera, INS, and GNSS. Firstly, a LiDAR odometer aided by the inertial navigation mechanization is designed, exploiting the short-term high-precision characteristics of inertial navigation to improve the accuracy of the LiDAR odometer. Secondly, the LiDAR odometer velocity and the GNSS/INS integrated navigation are fused in real time with an error-state Kalman filter. Thirdly, the high-precision positioning results of the LiDAR/GNSS/INS fusion are used to transform the point cloud into the map coordinate system, and a feature point cloud map and an RGB point cloud map are built from the point cloud-image feature fusion results. Finally, the proposed method is tested and evaluated on a mobile robot platform in a complex campus environment with poor GNSS observations. The LiDAR/GNSS/INS fusion positioning algorithm designed in this thesis effectively restrains the divergence of the positioning error, reducing the position error by 85% and the attitude error by 32.5%, with a maximum error of less than 1.5 m. The feature point cloud map constructed in this thesis marks or eliminates dynamic features in the point cloud map well, and the roads, zebra crossings, and lane lines in the constructed RGB point cloud map are clearly visible, which shows that the proposed algorithm works well. This matters especially when there are many pedestrians and vehicles in the complex campus environment and the robot is not yet fully autonomous: a point cloud map constructed by a general scheme will contain many dynamic objects, such as the robot's operators, cars, and pedestrians on the road, and the method proposed in this thesis solves this problem well.

In summary, to meet the need for continuous and accurate positioning and mapping of robots equipped with low-cost sensors in the complex campus environment, this thesis addresses the immaturity of the low-cost sparse LiDAR-camera system and the inaccuracy of positioning and mapping in the complex campus environment from three aspects: high-precision LiDAR-camera calibration, feature fusion of asynchronously sampled point clouds and images under dynamic conditions, and positioning and mapping based on LiDAR, camera, INS, and GNSS. The system is tested and evaluated on a mobile robot platform in the complex campus environment. The results show that the positioning accuracy is significantly improved, and the roads, zebra crossings, and lane lines in the constructed RGB point cloud map are clearly visible, which also confirms the accuracy of the LiDAR-camera calibration and of the registration of asynchronously sampled point clouds and images under dynamic conditions.
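The reprojection errors reported above (below 1.4 pixels on both image axes) are obtained by projecting the 3D points extracted from the LiDAR point cloud into the image with the estimated extrinsics and intrinsics, then comparing against the detected 2D infrared features. The following is a minimal sketch of that evaluation step only; the point correspondences, intrinsic matrix K, and extrinsics (R, t) are all illustrative placeholders, not values from the thesis.

```python
import numpy as np

# Hypothetical correspondences: 3D corner points extracted from the LiDAR
# point cloud by surface fitting, and the matching 2D pixel coordinates of
# the infrared features detected in the camera image (illustrative values).
lidar_points = np.array([[1.0,  0.2, 4.0],
                         [1.5, -0.3, 4.2],
                         [0.8,  0.5, 3.9]])   # metres, LiDAR frame
image_points = np.array([[823.0, 392.0],
                         [898.0, 319.0],
                         [793.0, 449.0]])     # pixels

# Assumed pinhole intrinsics: focal lengths and principal point in pixels.
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

# Assumed LiDAR-to-camera extrinsics: rotation R and translation t.
R = np.eye(3)
t = np.array([0.05, -0.02, 0.0])

def reproject(points_lidar, K, R, t):
    """Transform LiDAR points into the camera frame and project to pixels."""
    cam = points_lidar @ R.T + t       # LiDAR frame -> camera frame
    uvw = cam @ K.T                    # homogeneous pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]    # normalise by depth

projected = reproject(lidar_points, K, R, t)
err = projected - image_points                  # per-axis reprojection error
rms_xy = np.sqrt(np.mean(err ** 2, axis=0))     # RMS along image X and Y
print("RMS reprojection error (x, y) in pixels:", rms_xy)
```

In the pipeline described above, the 3D points would come from the surface-fitting step and the 2D points from the infrared-feature extraction; the per-axis RMS computed here is the quantity the abstract compares against the 1.4-pixel bound.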
Keywords/Search Tags: LiDAR-camera calibration, LiDAR-camera system, Multi-source fusion positioning and mapping, GNSS/INS integrated navigation