
Research On Traffic Sign Recognition Based On Transfer Learning

Posted on: 2022-02-24
Degree: Master
Type: Thesis
Country: China
Candidate: W Y Liu
Full Text: PDF
GTID: 2492306554971149
Subject: Master of Engineering

Abstract/Summary:
Deep learning techniques are developing rapidly and have been widely applied to traffic sign recognition. On the one hand, traditional traffic sign recognition models have complex structures and take a long time to train from scratch. On the other hand, although transfer learning saves training time, the structure and parameters of the original model (the teacher model) are very similar to those of its derivative model (the student model), so adversarial examples crafted against the teacher model are easily misclassified by the student model as well. Moreover, "fingerprint" recognition methods can now accurately identify the teacher model corresponding to a given student model; if an adversary can successfully attack the teacher model, a deployed student model is vulnerable to the same attack.

To address these problems, this thesis first proposes, based on knowledge distillation, a new scheme for training traffic sign recognition models. As a transfer learning method, knowledge distillation not only compresses the model structure but also improves the robustness and training efficiency of small-architecture models. The Distillation-VGGNet model trained in this thesis recognizes traffic signs with high accuracy and can be easily deployed at scale. Second, a lightweight defense method is proposed to improve the robustness of the Distillation-VGGNet student model. The main work of this thesis is as follows:

(1) To address the problems that training a traffic sign classification model requires a large amount of labeled data and that a complex model structure is inconvenient for large-scale deployment, this thesis first preprocesses a self-collected traffic sign dataset, then designs the traffic sign recognition model Distillation-VGGNet and trains it by knowledge distillation. Simulation tests show that the recognition accuracy of the student model on CN-TSD reaches 98.8%, higher than that of current mainstream models.

(2) To address the problem that the student model is vulnerable to adversarial examples crafted against its teacher model, a method of increasing the weight difference between the two models is proposed. Without significantly reducing the student model's accuracy on clean samples, it improves the student model's ability to correctly classify adversarial examples generated from the teacher model. This thesis uses the PGD (Projected Gradient Descent) algorithm and the CW2 (Carlini-Wagner) algorithm to generate adversarial examples of the teacher model, and feeds them to the distilled student model before and after the improvement to evaluate its robustness. Experimental results show that the accuracy of the improved student model under the CW2 attack and the PGD attack increases by 20% and 16.7% respectively, demonstrating that the proposed improvement makes the student model more robust.
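The abstract does not spell out the distillation objective used to train Distillation-VGGNet. As a rough sketch, the standard temperature-scaled knowledge-distillation loss is shown below; the temperature T, mixing weight alpha, and function names are illustrative assumptions, not the thesis's published settings.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Standard knowledge-distillation loss: KL divergence against the
    teacher's temperature-softened outputs, blended with ordinary
    cross-entropy on the ground-truth traffic-sign labels."""
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                                   # rescale gradients by T^2
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

def train_step(student, teacher, images, labels, optimizer):
    """One distillation step: the frozen teacher provides soft targets."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```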
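How the weight difference between student and teacher is increased is not described in the abstract. Purely as an illustration, and assuming some student layers share shapes with teacher layers, one plausible form is a penalty that discourages corresponding weight tensors from remaining similar; the function below is hypothetical, not the thesis's actual formulation.

```python
import torch
import torch.nn.functional as F

def weight_difference_penalty(student, teacher):
    """Hypothetical regularizer: sum the cosine similarity between
    corresponding student and teacher weight tensors, so minimizing it
    pushes the student's weights away from the teacher's."""
    penalty = torch.tensor(0.0)
    teacher_params = dict(teacher.named_parameters())
    for name, p_s in student.named_parameters():
        p_t = teacher_params.get(name)
        if p_t is not None and p_t.shape == p_s.shape:
            # similarity near 1 means the two weight tensors are still nearly identical
            penalty = penalty + F.cosine_similarity(
                p_s.flatten(), p_t.detach().flatten(), dim=0
            )
    return penalty

# Usage sketch: total_loss = distillation_loss(...) + lambda_w * weight_difference_penalty(student, teacher)
# where lambda_w trades off clean-sample accuracy against divergence from the teacher.
```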
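For reference, a minimal sketch of the PGD attack used to generate teacher-model adversarial examples is given below; the perturbation radius eps, step size alpha, and iteration count are illustrative defaults, not the settings used in the thesis experiments.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, images, labels, eps=8/255, alpha=2/255, steps=10):
    """Projected Gradient Descent: iteratively perturb the input within an
    L-infinity ball of radius eps so as to maximize the classification loss."""
    model.eval()
    adv = images.clone().detach()
    # random start inside the eps-ball
    adv = adv + torch.empty_like(adv).uniform_(-eps, eps)
    adv = torch.clamp(adv, 0, 1).detach()
    for _ in range(steps):
        adv.requires_grad_(True)
        loss = F.cross_entropy(model(adv), labels)
        grad = torch.autograd.grad(loss, adv)[0]
        # ascend the loss, then project back into the eps-ball around the clean images
        adv = adv.detach() + alpha * grad.sign()
        adv = torch.clamp(adv, images - eps, images + eps)
        adv = torch.clamp(adv, 0, 1)
    return adv.detach()
```

Adversarial examples produced this way from the teacher model can then be fed to the student model before and after the weight-difference improvement to compare robustness.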
Keywords/Search Tags:Traffic Sign Recognition, Transfer Learning, Knowledge Distillation, Weight, Robustness