With China’s rapid economic development, domestic vehicles have become increasingly popular, but the number of traffic accidents has also risen. Modern technologies such as autonomous driving can improve driving safety and reduce traffic accidents by eliminating human driving errors. At the heart of this technology is the ability to sense the driving environment: sensors installed in the vehicle provide effective awareness of the road environment and enable a rapid response to driving conditions. Commonly used sensors for intelligent-vehicle environment perception include millimetre-wave radar and cameras. Millimetre-wave radar can accurately measure a target’s speed, orientation, and distance, while vision sensors have powerful target-classification capabilities. However, visual perception struggles to distinguish the size and proximity of objects in an image, which can be fatal for road driving detection. One solution to this problem is Bird’s Eye View (BEV) technology, an image viewed from above that gives the vehicle an unobstructed, real-time picture of the road. In this thesis, we fuse millimetre-wave radar information with visual information and output the fused perception results as a bird’s-eye view that facilitates vehicle decision planning, taking full advantage of both sensors to further improve the accuracy of vehicle decision making. The main elements of this research are as follows:

(a) To address environmental noise and multipath interference in millimetre-wave radar signals, data pre-processing including de-noising, Kalman filtering, and other filtering methods is applied to remove the interference. Cutting-edge 4D millimetre-wave radar point-cloud data and algorithms are also discussed.

(b) To address the difficulty vision algorithms have in distinguishing the size and proximity of objects in an image, a bird’s-eye-view target-detection method is used to represent the output as a top-down image. An efficient, improved BEV projection method is chosen to encode image information from the camera’s perspective-space features onto the BEV plane.

(c) A millimetre-wave radar and vision fusion network is designed to address the difficulty of fully and accurately perceiving the surrounding environment with a single sensor. The output stage converts detected targets into a BEV view to complete the fusion network.

(d) A detection platform for the machine-vision and millimetre-wave-radar fusion network was built. The hardware platform was completed by selecting appropriate radar, camera, and other equipment; the fusion model was then implemented in the software system, and real-vehicle data acquisition was carried out in a real driving environment. After experimental validation and statistical analysis, the proposed fusion network improves detection accuracy by 5.5% and detection speed by 4.9 frames per second compared with the vision-only algorithm, indicating that it increases detection speed while ensuring target-detection accuracy, and outputs results in a BEV form that supports vehicle decision planning.
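As a rough illustration of the Kalman-filtering step in point (a), the sketch below smooths noisy radar range measurements with a constant-velocity Kalman filter. The time step, noise variances, and initial covariance are illustrative assumptions, not values from this thesis:

```python
import numpy as np

def kalman_filter_range(measurements, dt=0.05, meas_var=4.0, process_var=1.0):
    """Smooth noisy radar range readings with a constant-velocity Kalman filter.

    State x = [range, range_rate]. All parameter values are illustrative.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity transition
    H = np.array([[1.0, 0.0]])                  # radar observes range only
    Q = process_var * np.array([[dt**4 / 4, dt**3 / 2],
                                [dt**3 / 2, dt**2]])  # process noise
    R = np.array([[meas_var]])                  # measurement noise
    x = np.array([[measurements[0]], [0.0]])    # initial state estimate
    P = np.eye(2) * 10.0                        # initial covariance

    filtered = []
    for z in measurements:
        # Predict.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new range measurement.
        y = np.array([[z]]) - H @ x             # innovation
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        filtered.append(float(x[0, 0]))
    return filtered
```

On a target receding at constant speed, the filtered trace tracks the true range with substantially less scatter than the raw measurements once the filter has converged.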
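The BEV projection in point (b) is a learned encoder in the thesis; the sketch below shows only the underlying geometry — a minimal flat-ground (inverse-perspective) mapping from an image pixel to a BEV grid cell, assuming known camera intrinsics K and extrinsics (R, t) and a z = 0 ground plane. The grid resolution and origin are hypothetical:

```python
import numpy as np

def pixel_to_bev(u, v, K, R, t, grid_res=0.5, grid_origin=(0.0, -20.0)):
    """Map a pixel assumed to lie on the flat ground plane (z = 0) to a BEV
    grid cell. K: intrinsics; R, t: world-to-camera extrinsics. World axes:
    x forward, y left, z up. All parameter values are illustrative."""
    # Back-project the pixel to a viewing ray in camera coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Express the ray and the camera centre in world coordinates.
    ray_world = R.T @ ray_cam
    cam_centre = -R.T @ t
    # Intersect the ray with the ground plane z = 0.
    if abs(ray_world[2]) < 1e-9:
        return None                      # ray parallel to the ground
    s = -cam_centre[2] / ray_world[2]
    if s <= 0:
        return None                      # intersection behind/above the camera
    ground = cam_centre + s * ray_world
    # Discretise the ground point into BEV grid indices.
    ix = int((ground[0] - grid_origin[0]) / grid_res)
    iy = int((ground[1] - grid_origin[1]) / grid_res)
    return ix, iy
```

Round-tripping a known ground point through the camera projection and back recovers the correct BEV cell; pixels above the horizon correctly return no ground intersection.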
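For the radar–vision fusion in point (c), one common pattern is to rasterise radar detections into BEV channels and concatenate them with the camera BEV feature map before a detection head. The sketch below follows that pattern; the channel layout, grid extent, and feature sizes are assumptions for illustration, not the thesis’s exact network design:

```python
import numpy as np

def radar_points_to_bev(points, grid_shape=(200, 200), grid_res=0.5,
                        x_range=(0.0, 100.0), y_range=(-50.0, 50.0)):
    """Rasterise radar detections (x, y, radial_velocity, rcs) into a
    3-channel BEV map: occupancy, velocity, RCS. Layout is illustrative."""
    bev = np.zeros((3, *grid_shape), dtype=np.float32)
    for x, y, vel, rcs in points:
        if not (x_range[0] <= x < x_range[1] and y_range[0] <= y < y_range[1]):
            continue                     # drop points outside the grid
        ix = int((x - x_range[0]) / grid_res)
        iy = int((y - y_range[0]) / grid_res)
        bev[0, ix, iy] = 1.0             # occupancy
        bev[1, ix, iy] = vel             # radial velocity
        bev[2, ix, iy] = rcs             # radar cross-section
    return bev

def fuse_bev(camera_bev, radar_bev):
    """Early fusion by channel concatenation; a CNN head would follow."""
    return np.concatenate([camera_bev, radar_bev], axis=0)
```

Because both modalities share the same BEV grid after projection, fusion reduces to aligning and stacking channels, which is what lets the fused output feed vehicle decision planning directly in top-down coordinates.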