With the development of UAV technology, aerial photography has gradually become an important means of image acquisition and an important way for humans to collect data and understand the world. Real-time vehicle detection from UAV imagery is a significant application of aerial photography. As a key component of intelligent transportation systems, it plays an important role in real-time traffic monitoring, highway patrol, illegal-parking enforcement, and related tasks. Vehicle targets in aerial imagery are usually small and easily occluded by the road environment, so it is difficult to obtain accurate detection results by applying general object detection algorithms directly. Existing approaches often rely on more complex algorithms to improve the detection accuracy of aerial vehicles, which makes it difficult to meet the real-time requirements of practical applications. In this paper, a deep-learning-based method is proposed for vehicle detection in aerial scenes that achieves accurate detection in real time. The main contributions are twofold.

First, to address the vehicle detection problems in aerial scenes, a real-time vehicle detection method based on multi-scale feature fusion is proposed. The multi-scale feature representation in a classical object detection network has different semantic strength and spatial resolution at each scale. We effectively combine low-resolution, semantically strong features with high-resolution, semantically weak features through a multi-scale feature fusion module, making every level of the in-network feature pyramid semantically strong and thereby improving vehicle detection performance, especially for small and occluded vehicles. In addition, by analyzing the scale distribution of the aerial vehicle dataset and the effective receptive field of the network, we propose a method for setting appropriate default boxes, which further improves the detection results. Together, these designs address the detection problems caused by small scale and frequent occlusion, and enable real-time, accurate detection of vehicles from UAV imagery.

Second, to address the low detection accuracy on hard examples in the vehicle dataset, we propose a method that balances the loss contribution of hard and easy examples during training through an alternate training strategy using multiple loss functions. By combining the advantages of Cross Entropy Loss and Focal Loss, this method ensures that both easy and hard examples are trained adequately, so that the detection model learns more discriminative feature representations. Experimental results show that this method greatly improves the detection accuracy on hard examples while maintaining high detection precision on easy examples.

In conclusion, the real-time detection method proposed in this paper achieves accurate, real-time vehicle detection from UAV imagery. In addition, the aerial vehicle dataset collected and annotated in this work can support related research on aerial vehicle detection, and has both theoretical research value and practical application value.
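The top-down multi-scale fusion described above can be sketched as follows. This is a minimal NumPy illustration under simplifying assumptions: the function names `upsample2x` and `fuse_pyramid` are ours, upsampling is nearest-neighbor, and fusion is element-wise addition; a real detector would also apply 1x1 lateral convolutions to align channel dimensions before adding, as in feature pyramid networks.

```python
import numpy as np

def upsample2x(feat):
    """Nearest-neighbor 2x spatial upsampling of a (C, H, W) feature map."""
    return feat.repeat(2, axis=1).repeat(2, axis=2)

def fuse_pyramid(features):
    """Top-down fusion: starting from the coarsest (semantically strongest)
    level, repeatedly upsample and add into the next finer (higher-resolution,
    semantically weaker) level, so every output level carries strong semantics.
    `features` is ordered finest to coarsest; output keeps the same order."""
    fused = [features[-1]]                 # coarsest level passes through
    for feat in reversed(features[:-1]):   # walk toward finer levels
        fused.append(feat + upsample2x(fused[-1]))
    return fused[::-1]                     # restore finest-to-coarsest order

# Toy 3-level pyramid with 8 channels, e.g. strides 8/16/32 of a 64x64 input.
pyramid = [np.random.randn(8, s, s) for s in (8, 4, 2)]
out = fuse_pyramid(pyramid)
print([f.shape for f in out])  # per-level shapes are preserved
```

Each output level keeps its original resolution, so high-resolution levels (which detect small vehicles) now also carry the coarse levels' semantics.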
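The hard/easy balancing idea can be illustrated with the two loss functions involved. The sketch below is hypothetical in its details: the abstract does not specify the alternation schedule, so the per-epoch switch in `epoch_loss` is only an assumed illustration of how Cross Entropy and Focal Loss could be alternated.

```python
import math

def cross_entropy(p):
    """Cross Entropy for true-class probability p: weights all examples equally."""
    return -math.log(p)

def focal_loss(p, gamma=2.0):
    """Focal Loss down-weights easy examples (p near 1) by (1 - p)**gamma,
    so hard examples (small p) dominate the loss."""
    return -((1.0 - p) ** gamma) * math.log(p)

def epoch_loss(probs, epoch):
    """Illustrative alternate-training schedule (assumed, not from the paper):
    even epochs use Cross Entropy so easy examples are trained adequately;
    odd epochs use Focal Loss so hard examples receive more attention."""
    loss_fn = cross_entropy if epoch % 2 == 0 else focal_loss
    return sum(loss_fn(p) for p in probs) / len(probs)

easy, hard = 0.95, 0.30                     # well-classified vs. hard example
print(cross_entropy(easy), focal_loss(easy))  # Focal Loss is ~400x smaller here
print(cross_entropy(hard), focal_loss(hard))  # hard example keeps ~49% of its CE loss
```

With gamma = 2, the easy example's contribution is scaled by (1 - 0.95)^2 = 0.0025 while the hard example keeps (1 - 0.3)^2 = 0.49 of its Cross Entropy value, which is the imbalance-correcting behavior the alternation exploits.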