The Research On Key Technology Of Simultaneous Localization And Mapping In Autonomous Driving Scenes

Posted on: 2022-03-16
Degree: Master
Type: Thesis
Country: China
Candidate: B Song
Full Text: PDF
GTID: 2480306341455914
Subject: Geodesy and Survey Engineering

Abstract/Summary:
As a signature product of the new-infrastructure era, autonomous driving involves positioning, navigation, path planning, intelligent control, and other technical fields. It also draws on traditional algorithms such as the Kalman filter and the particle filter, as well as cutting-edge fields such as deep learning, pattern recognition, and image processing. Among them, the localization and navigation direction of autonomous driving includes global localization methods such as GNSS and IMU, and local localization methods such as simultaneous localization and mapping (SLAM) using LiDAR and vision cameras. However, few products based on the various SLAM methods are in actual operation: the reliability and confidence of SLAM algorithms in real environments are not high, and sufficient road-test data are lacking. These problems hinder the promotion and large-scale commercialization of autonomous driving in the automobile market, which means there is still room for optimization to different degrees and in different directions. This paper studies single-sensor SLAM experiments with LiDAR and with a vision camera, as well as field-measurement research in a real environment with the two sensors fused.

In this paper, a four-wheel-drive differential-steering chassis is used as the basic experimental chassis, equipped with NVIDIA's Jetson Nano processor. According to the actual requirements, a basic experimental platform with length, width, and height of 30 cm, 26 cm, and 7 cm was built. A 16-line LiDAR and a depth camera were installed on the platform to carry out single-sensor SLAM experiments and an exploratory SLAM experiment fusing the information from both sensors.

The experiments showed the following.

(1) For SLAM experiments with different sensors in the same environment: in a bright environment, the SLAM results based on the 16-line LiDAR show a maximum deviation of the feature parameters and feature points of 1.961 m, a mean error of 0.398 m, and a median error of ±0.644 m; the results based on the depth camera show a maximum deviation of 1.519 m, a mean error of 0.112 m, and a median error of ±0.203 m. In a dark environment, the 16-line LiDAR results show a maximum deviation of the feature parameters and the shape parameters of the feature points of 2.004 m, a mean error of 0.477 m, and a median error of ±0.683 m; the depth-camera results show a maximum deviation of 1.154 m, a mean error of 0.143 m, and a median error of ±0.282 m. In general, in single-sensor SLAM, the accuracy and final effect based on the depth camera are superior to those based on the 16-line LiDAR.

(2) For SLAM experiments with the same sensor in different environments: for the 16-line LiDAR, the maximum deviation of the shape parameters of the feature variables and feature points in the bright environment is 1.961 m, with a mean error of 0.398 m and a median error of ±0.644 m; in the dark environment, the maximum deviation is 2.004 m, the mean error is 0.477 m, and the median error is ±0.683 m. Overall, the median error of the LiDAR results in the bright environment is 5.71% better than in the dark environment; for the depth-camera SLAM experiment, the accuracy of the result in the bright environment is 28.01% better than in the dark environment. In conclusion, the experimental results in the bright environment are better than those in the dark environment, which is more conducive to guaranteeing autonomous-driving safety.

(3) In the exploratory SLAM experiment with multi-sensor data fusion, manual and automatic calibration experiments were carried out, the two calibration results were superimposed, and the preliminary fusion accuracy was analysed. The mean square error of the initial fusion is 0.0002 under manual calibration and 0.0017 under automatic calibration, which demonstrates the feasibility of multi-sensor fusion SLAM.

Figures: 34; Tables: 13; References: 80.
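The abstract reports three accuracy statistics per experiment: maximum deviation, mean error, and a signed "median error" (the conventional term in Chinese surveying for the root-mean-square error, reported with a ± sign). A minimal Python sketch of how such statistics could be computed from feature-point deviations is shown below; the deviation values are hypothetical placeholders, since the thesis's raw measurements are not given here.

```python
import numpy as np

# Hypothetical deviations (metres) between mapped feature points and their
# surveyed reference positions; placeholder data, not the thesis's measurements.
deviations = np.array([0.21, 0.35, 0.08, 1.52, 0.44, 0.12, 0.67])

max_deviation = deviations.max()    # largest single offset
mean_error = deviations.mean()      # average offset
# "Median error" in the Chinese surveying sense: root-mean-square error.
median_error = np.sqrt(np.mean(deviations ** 2))

print(f"max deviation: {max_deviation:.3f} m")
print(f"mean error:    {mean_error:.3f} m")
print(f"median error:  +/-{median_error:.3f} m")
```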
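The bright-versus-dark improvement percentages in finding (2) follow directly from the reported median errors, as this short check shows:

```python
# Relative improvement of the bright-environment result over the dark one,
# computed from the median errors reported in the abstract.
lidar_bright, lidar_dark = 0.644, 0.683
camera_bright, camera_dark = 0.203, 0.282

lidar_gain = (lidar_dark - lidar_bright) / lidar_dark      # ~0.0571
camera_gain = (camera_dark - camera_bright) / camera_dark  # ~0.2801

print(f"LiDAR:  {lidar_gain:.2%}")   # 5.71%
print(f"Camera: {camera_gain:.2%}")  # 28.01%
```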
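Finding (3) rests on joint extrinsic calibration between the LiDAR and the depth camera. The thesis's calibration procedure is not detailed in the abstract, so the following is only a generic sketch of how a calibration result is typically applied: a 4x4 extrinsic matrix transforms LiDAR points into the camera frame before fusion. The rotation and translation values are placeholders, not the thesis's calibration output.

```python
import numpy as np

R = np.eye(3)                     # placeholder rotation (identity)
t = np.array([0.05, 0.00, 0.10])  # placeholder lever arm, metres

# Homogeneous extrinsic transform from the LiDAR frame to the camera frame.
T_cam_lidar = np.eye(4)
T_cam_lidar[:3, :3] = R
T_cam_lidar[:3, 3] = t

def lidar_to_camera(points_lidar: np.ndarray) -> np.ndarray:
    """Apply the extrinsic transform to an (N, 3) array of LiDAR points."""
    homo = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    return (T_cam_lidar @ homo.T).T[:, :3]

pts = np.array([[2.0, 0.5, 0.1], [5.3, -1.2, 0.4]])
print(lidar_to_camera(pts))
```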
Keywords/Search Tags:Autonomous Driving, SLAM, LOAM, RTAB_MAP, Multi-Sensor Joint Calibration, Multi-Sensor Fusion SLAM