Object detection is one of the fundamental tasks in computer vision, and deep-learning-based object detection methods have been widely studied. This work focuses on improving the performance of the one-stage object detector YOLO. The main contributions are as follows.

First, YOLOv3-FE, an improved YOLOv3 algorithm based on feature enhancement, is proposed. YOLOv3-FE alleviates the problems of insufficient receptive fields and weak feature expressiveness, thereby improving detection accuracy. In YOLOv3-FE, a receptive field enhancement module is placed before the feature pyramid to enlarge the receptive fields while preserving spatial resolution, and the features of its different branches are fused to obtain richer information. In addition, a partial decoder module processes the features of the different layers in the feature pyramid network separately to enhance their expressiveness.

Second, YOLOv4-FE, an improved YOLOv4 algorithm based on adaptive feature fusion, is proposed. YOLOv4-FE addresses the missed detections caused by insufficient feature fusion in the path aggregation network, as well as the conflict between classification and localization in the coupled detection head used in YOLOv4. In YOLOv4-FE, an adaptive spatial feature fusion module assigns a different weight to the features of each layer, with the weight parameters learned adaptively. In addition, a decoupled head with two branches, one for localization and one for classification, further enhances the robustness of the features and improves detection accuracy.

Third, experiments are carried out on different datasets. YOLOv3-FE improves detection accuracy (mAP) by 3.7% on the PASCAL VOC dataset; YOLOv4-FE improves mAP by 1.81% on PASCAL VOC and by 5.33% on Cityscapes. The results show that the proposed models improve detection accuracy while preserving real-time performance.
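The abstract does not give the internal structure of the receptive field enhancement module. As a minimal sketch, the PyTorch module below assumes the common design choice of parallel 3x3 convolutions with increasing dilation rates whose outputs are concatenated and fused by a 1x1 convolution; the class name, channel sizes, and dilation rates are hypothetical, not taken from the thesis.

import torch
import torch.nn as nn

class ReceptiveFieldEnhancement(nn.Module):
    """Enlarge the receptive field without shrinking the feature map:
    parallel dilated 3x3 branches see different context sizes, and a
    1x1 convolution fuses the concatenated branch outputs."""

    def __init__(self, in_ch, out_ch, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                # padding == dilation keeps the spatial size unchanged
                # for a 3x3 kernel, so resolution is preserved.
                nn.Conv2d(in_ch, out_ch, 3, padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.LeakyReLU(0.1, inplace=True),
            )
            for d in dilations
        )
        self.fuse = nn.Conv2d(out_ch * len(dilations), out_ch, kernel_size=1)

    def forward(self, x):
        return self.fuse(torch.cat([branch(x) for branch in self.branches], dim=1))

if __name__ == "__main__":
    rfe = ReceptiveFieldEnhancement(256, 256)
    out = rfe(torch.randn(1, 256, 13, 13))
    print(out.shape)  # torch.Size([1, 256, 13, 13])

Because each branch pads by exactly its dilation rate, every branch preserves the input resolution, which is the property the abstract attributes to the module.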
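For the adaptive spatial feature fusion module, a standard formulation learns a per-pixel weight for each pyramid level and normalizes the weights with a softmax so they sum to one at every location. The sketch below follows that formulation under the assumption that all levels share the same channel count and are fused at the resolution of the first level; names and shapes are illustrative, not the thesis's code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveSpatialFeatureFusion(nn.Module):
    """Fuse pyramid levels with learned per-pixel weights: each level
    yields a single-channel weight logit map, a softmax across levels
    makes the weights sum to one at every spatial location, and the
    fused map is the weighted sum of the (resized) level features."""

    def __init__(self, channels, num_levels=3):
        super().__init__()
        self.weight_convs = nn.ModuleList(
            nn.Conv2d(channels, 1, kernel_size=1) for _ in range(num_levels)
        )

    def forward(self, feats):
        # feats: list of (N, C, Hi, Wi) tensors, fused at feats[0]'s size.
        size = feats[0].shape[-2:]
        resized = [feats[0]] + [
            F.interpolate(f, size=size, mode="nearest") for f in feats[1:]
        ]
        logits = torch.cat(
            [conv(f) for conv, f in zip(self.weight_convs, resized)], dim=1
        )                                    # (N, num_levels, H, W)
        weights = F.softmax(logits, dim=1)   # adaptively learned weights
        return sum(weights[:, i : i + 1] * f for i, f in enumerate(resized))

if __name__ == "__main__":
    asff = AdaptiveSpatialFeatureFusion(channels=256)
    levels = [torch.randn(1, 256, s, s) for s in (52, 26, 13)]
    print(asff(levels).shape)  # torch.Size([1, 256, 52, 52])

Because the weight logits are produced by convolutions trained with the rest of the network, the fusion weights are learned end-to-end rather than fixed, which is the "adaptively learned" behavior the abstract describes.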
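The decoupled head replaces the single coupled prediction convolution with two parallel branches, one per task. A plausible minimal version, assuming YOLOX-style 3x3 convolution stacks and an anchor-based output layout (4 box offsets plus 1 objectness score per anchor), is sketched below; the layer widths and depths are assumptions, not the thesis's configuration.

import torch
import torch.nn as nn

def conv_bn_act(in_ch, out_ch):
    """3x3 conv + BN + activation; spatial size is preserved."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.SiLU(inplace=True),
    )

class DecoupledHead(nn.Module):
    """Separate classification and localization branches so the two
    tasks no longer compete inside one coupled prediction conv."""

    def __init__(self, in_ch, num_classes, num_anchors=3, width=256):
        super().__init__()
        self.stem = nn.Conv2d(in_ch, width, kernel_size=1)  # shared 1x1 reduction
        self.cls_branch = nn.Sequential(conv_bn_act(width, width),
                                        conv_bn_act(width, width))
        self.reg_branch = nn.Sequential(conv_bn_act(width, width),
                                        conv_bn_act(width, width))
        self.cls_pred = nn.Conv2d(width, num_anchors * num_classes, 1)
        # 4 box offsets + 1 objectness score per anchor.
        self.reg_pred = nn.Conv2d(width, num_anchors * 5, 1)

    def forward(self, x):
        x = self.stem(x)
        cls_out = self.cls_pred(self.cls_branch(x))
        reg_out = self.reg_pred(self.reg_branch(x))
        return cls_out, reg_out

if __name__ == "__main__":
    head = DecoupledHead(in_ch=512, num_classes=20)  # 20 = PASCAL VOC classes
    cls_out, reg_out = head(torch.randn(1, 512, 13, 13))
    print(cls_out.shape, reg_out.shape)  # (1, 60, 13, 13) (1, 15, 13, 13)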