
A Research On Simultaneous Localization And Mapping Of Mobile Robot With Omnidirectional Vision

Posted on: 2011-01-14
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y Q Wang
Full Text: PDF
GTID: 1118330332960174
Subject: Control theory and control engineering
Abstract/Summary:
Simultaneous localization and mapping (SLAM) for mobile robots with omni-directional vision is currently a very active research topic. Omni-directional vision sensors are increasingly used in this field because they offer a rich source of environment information over a wide angle of view. Usually, the omni-directional image is first transformed into a conventional perspective image, but this is a complex process with low efficiency, and the temporal complexity of vision-based SLAM makes real-time operation difficult. This dissertation therefore studies SLAM for a mobile robot with omni-directional vision in which the omni-directional image does not need to be transformed.

First, the scale-invariant feature transform (SIFT) and a modified algorithm for feature extraction and matching in omni-directional images are studied. An angle constraint is used to eliminate wrong matches within the valid region of the omni-directional images. A modified approach based on sampling and the mean shift algorithm is proposed to reduce the number of features generated by SIFT and, with it, their extraction and matching time: the number of features is controlled by the number of sampling points, and the mean shift algorithm actively searches for local extrema in scale space to improve efficiency. Experiments show that the modified algorithm clearly reduces the feature extraction and matching time while keeping the matching stable and accurate.

Second, the SLAM system of a mobile robot with an omni-directional vision sensor is studied. The motion model of the four-wheel robot is simplified to a two-wheel differential-drive model. The principle and structure of the catadioptric hyperboloid omni-directional vision sensor are analyzed, and its actual parameters are calculated. A perceptual model is proposed that combines the omni-directional pixel coordinates with odometer data to obtain three-dimensional coordinates in the robot coordinate system. The SLAM result is obtained by iteratively combining the motion model and the perceptual model.

Third, the SLAM system based on Bayesian filtering is studied. The motion model and the perceptual model, both with their noise included, are combined to obtain an accurate estimate of the system state iteratively. It is shown that this handling of uncertain information improves the accuracy of the state estimate, but that the main remaining problem is the temporal complexity of data association in SLAM. The FastSLAM algorithm is chosen for the omni-directional mobile robot because of its superior temporal complexity.

Finally, time optimization of the SLAM system is studied. The modified SIFT is used to reduce the time spent extracting and matching features in the omni-directional image, and the number of features handled by SLAM is reduced as well; experiments show that the modified SIFT is stable and effective for this purpose. In addition, a dynamic management method for the feature database, based on the number of matches and on matching continuity, is proposed. It is shown that this time optimization improves both the utilization of features and the efficiency of SLAM data association, providing a good solution for keeping the omni-directional mobile robot SLAM real-time.
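To illustrate the angle constraint on feature matching described above, the following Python sketch matches SIFT features between two omni-directional frames with OpenCV and rejects matches whose bearing about the image centre changes by more than a threshold. The centre coordinates, the threshold value, and the use of a Lowe ratio test are assumptions made for illustration, not the exact procedure of the dissertation.

```python
import cv2
import numpy as np

def bearing(kp, cx, cy):
    """Bearing of a keypoint about the omni-directional image centre (cx, cy)."""
    return np.arctan2(kp.pt[1] - cy, kp.pt[0] - cx)

def match_with_angle_constraint(img1, img2, cx, cy, max_dtheta=np.deg2rad(15)):
    """SIFT matching between two omni-directional frames; matches whose bearing
    about the image centre changes by more than max_dtheta are rejected."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(des1, des2, k=2)

    good = []
    for m, n in raw:
        if m.distance < 0.7 * n.distance:  # Lowe ratio test (assumed here)
            dtheta = bearing(kp2[m.trainIdx], cx, cy) - bearing(kp1[m.queryIdx], cx, cy)
            dtheta = np.arctan2(np.sin(dtheta), np.cos(dtheta))  # wrap to [-pi, pi]
            if abs(dtheta) < max_dtheta:   # angle constraint on the bearing change
                good.append(m)
    return kp1, kp2, good
```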
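The two-wheel differential-drive motion model mentioned above can be written as a simple odometry update. The minimal sketch below assumes the pose is (x, y, theta), the inputs are the wheel travel distances over one step, and b is the wheel base; the symbols are illustrative and not taken from the dissertation.

```python
import math

def diff_drive_update(x, y, theta, d_left, d_right, b):
    """Odometry update for a two-wheel differential-drive model.
    d_left, d_right: wheel travel over one time step; b: wheel base."""
    d = 0.5 * (d_left + d_right)        # distance travelled by the robot centre
    dtheta = (d_right - d_left) / b     # change of heading
    x += d * math.cos(theta + 0.5 * dtheta)   # integrate at the mid-step heading
    y += d * math.sin(theta + 0.5 * dtheta)
    theta += dtheta
    theta = math.atan2(math.sin(theta), math.cos(theta))  # keep heading in [-pi, pi]
    return x, y, theta
```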
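The dynamic management of the feature database based on matching number and matching continuity can be pictured as bookkeeping over per-feature counters: a feature is kept while it is matched often or still being tracked, and pruned otherwise, so that data association stays fast. The thresholds and field names in the sketch below are assumptions made only for illustration.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    descriptor: list
    match_count: int = 0       # matching number: total times this feature was matched
    misses_in_a_row: int = 0   # matching continuity: consecutive frames without a match

def update_database(db, matched_ids, min_matches=3, max_misses=5):
    """Prune features that are both rarely matched and no longer being tracked,
    keeping the database small so SLAM data association stays efficient."""
    for fid, feat in list(db.items()):
        if fid in matched_ids:
            feat.match_count += 1
            feat.misses_in_a_row = 0
        else:
            feat.misses_in_a_row += 1
        stale = feat.misses_in_a_row > max_misses
        weak = feat.match_count < min_matches
        if stale and weak:
            del db[fid]
    return db
```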
Keywords/Search Tags: mobile robot, simultaneous localization and mapping, omnidirectional vision, scale invariant feature transform, mean shift