With the rapid development of human society, science, and technology, mobile robots are being used ever more widely in both civil and military fields. Autonomous navigation is the foundation of these applications, and many autonomous localization and navigation methods have therefore emerged. As a typical representative of visual SLAM, ORB-SLAM2 can perform simultaneous localization and mapping in a variety of scenarios. However, its working environment is generally limited to static, rigid scenes free of human interference; it cannot provide accurate localization information or a reliable navigation map in indoor dynamic scenes, so the range of scenarios to which ORB-SLAM2 applies is very limited. In addition, ORB-SLAM2 is susceptible to external factors such as rapid changes in the camera's field of view and variations in illumination, which lead to insufficient localization accuracy or poor mapping quality for the mobile robot. This paper therefore studies the problems of low localization accuracy and poor map construction of ORB-SLAM2 in indoor dynamic scenes. The specific research content is as follows:

(1) To address the problem that the localization accuracy and mapping quality of ORB-SLAM2 in indoor dynamic scenes are easily degraded by dynamic targets, deep learning is first introduced into the ORB-SLAM2 framework, following its multithreaded design, to detect dynamic targets; the camera pose is then estimated using ORB feature points from static regions and from the edges of non-dynamic target masks. Next, the ORB feature extraction process is optimized: the image pyramid and Gaussian filtering are used to remove low-quality ORB feature points, improving the quality of keyframes and the accuracy of feature matching. Finally, a dense mapping thread is added to build a three-dimensional dense point cloud map, improving the localization accuracy and mapping quality of ORB-SLAM2 in dynamic scenes.

(2) To address the problem that the localization accuracy of ORB-SLAM2 in indoor dynamic scenes is easily affected by external factors such as rapid changes in the camera's field of view or variations in illumination, and exploiting the complementarity of vision and the IMU, this paper proposes a dynamic nonlinear optimization algorithm based on the tight coupling of an RGB-D camera and an IMU, so that ORB-SLAM2 can localize accurately under fast motion. First, the IMU pre-integration is derived and the visual-IMU joint initialization is simplified, giving the proposed algorithm an interference-resistant automatic initialization. Then, sliding-window marginalization and an adaptive adjustment factor are introduced to realize tightly coupled visual-IMU nonlinear optimization. Finally, through sensor calibration and a data synchronization program, bidirectional visual-IMU fusion is realized, further improving the localization accuracy and robustness of the proposed algorithm in indoor dynamic scenes.
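To make the pre-integration step above concrete, the following is a minimal sketch of how IMU measurements between two keyframes can be accumulated into relative rotation, velocity, and position terms. It uses simple forward-Euler integration and omits the bias Jacobians and noise-covariance propagation used in a full tightly coupled system; the function names and interfaces are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(phi):
    """Rotation matrix from a rotation vector (Rodrigues formula)."""
    theta = np.linalg.norm(phi)
    if theta < 1e-8:
        return np.eye(3) + skew(phi)
    axis = phi / theta
    K = skew(axis)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(gyro, accel, dt, bg=np.zeros(3), ba=np.zeros(3)):
    """Accumulate IMU samples between two keyframes into pre-integrated
    terms (dR, dv, dp), expressed in the body frame of the first keyframe.
    Gravity is accounted for later, in the optimization residual.
    gyro, accel: (N, 3) angular velocity / acceleration samples
    dt:          (N,) sample intervals in seconds
    bg, ba:      current gyroscope / accelerometer bias estimates
    """
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a, h in zip(gyro, accel, dt):
        a_corr = a - ba
        # Position and velocity use the rotation at the start of the step.
        dp += dv * h + 0.5 * (dR @ a_corr) * h * h
        dv += (dR @ a_corr) * h
        dR = dR @ exp_so3((w - bg) * h)
    return dR, dv, dp
```

These pre-integrated terms can then be compared against the relative pose predicted by the visual front end, forming the IMU residual in the sliding-window optimization.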
(3) To verify the feasibility and practicality of the proposed algorithm, computer simulations and physical experiments are carried out. First, comparison experiments on the TUM datasets are conducted for ORB-SLAM2, DynaSLAM, and the proposed algorithm under static and dynamic scenarios, evaluating the localization accuracy and mapping quality of the proposed algorithm. Then, in real scenes, the localization and map construction of the proposed algorithm are verified on a physical prototype. The experimental results show that, compared with ORB-SLAM2 and DynaSLAM, the proposed algorithm achieves better localization accuracy and mapping quality, and can accomplish accurate localization and dense mapping for mobile robots in indoor dynamic scenes with better real-time performance and robustness.
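As an illustration of how localization accuracy against TUM ground truth is commonly quantified, the sketch below computes the absolute trajectory error (ATE) RMSE after rigid alignment of the estimated and ground-truth trajectories. The abstract does not name this metric, so treat it as an assumption about the evaluation; it also assumes the two trajectories are already time-associated, and the function names are ours.

```python
import numpy as np

def align_rigid(est, gt):
    """Least-squares rigid alignment (R, t, no scale) mapping the
    estimated positions onto the ground-truth positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    U, _, Vt = np.linalg.svd(G.T @ E)
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        S[2, 2] = -1.0          # keep a proper rotation (det = +1)
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """ATE RMSE between time-associated (N, 3) position arrays."""
    R, t = align_rigid(est, gt)
    err = gt - (est @ R.T + t)
    return np.sqrt((err ** 2).sum(axis=1).mean())
```

A lower ATE RMSE on the dynamic TUM sequences is the kind of quantitative evidence that would support the accuracy comparison summarized above.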