
Bolt Defect Image Classification Of Transmission Line Based On Knowledge Distillation

Posted on: 2022-09-01
Degree: Master
Type: Thesis
Country: China
Candidate: C X Jin
Full Text: PDF
GTID: 2492306566478234
Subject: Information and Communication Engineering
Abstract/Summary:
Two-dimensional bolt images from transmission lines carry limited visual information, so during bolt image classification large models consume substantial resources while small models achieve low accuracy. This thesis introduces deep-learning-based knowledge distillation into the transmission line bolt image classification task. Guided by the characteristics of a self-built data set, the work proceeds from two perspectives, network performance transfer and network structure matching, so that a small model suitable for large-scale deployment can closely match the bolt defect classification performance of a large model and strike a balance between resource consumption and accuracy.

Bolt images exhibit small inter-class differences and large intra-class differences, which cause the large model to transfer unsatisfactory or even incorrect knowledge. To address this, the thesis proposes a bolt defect image classification method based on dynamically supervised knowledge distillation, from the perspective of network performance transfer. An adaptive weighting method at the output layer of the network improves the accuracy with which the small model learns bolt defect labels, and an attention transfer mechanism at the hidden layers improves the small model's ability to express bolt features. The large model (ResNet-40-2) combines the output-layer adaptive weighting with hidden-layer attention transfer to guide the training of the small model (ResNet-16-1), substantially improving the small model's bolt defect classification ability. The proposed distillation method was verified on the self-built bolt defect image classification data set: the small model reaches a classification accuracy of 89.28%, an increase of 2.17%; its accuracy is only 0.63% below that of the large model, while its parameter count is only 7.8% of the large model's.
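To make the combined objective concrete, the sketch below shows one way such a distillation loss could look in PyTorch. The abstract does not specify the exact adaptive-weighting rule, so the confidence-based per-sample weight here is an assumption standing in for the thesis's dynamic supervision, and the hidden-layer term follows the standard attention-transfer formulation (Zagoruyko & Komodakis, 2017) rather than the thesis's exact variant. Names such as `student_feats` and `teacher_feats` are hypothetical placeholders for matched residual-stage activations of the two ResNets.

```python
# A minimal sketch of a dynamically supervised KD loss, assuming a
# confidence-based adaptive weight and standard attention transfer.
import torch
import torch.nn.functional as F

def attention_map(feat):
    # Collapse channels of a (B, C, H, W) activation into a spatial
    # attention map and L2-normalize it per sample.
    a = feat.pow(2).mean(dim=1).flatten(1)          # (B, H*W)
    return F.normalize(a, dim=1)

def distillation_loss(student_logits, teacher_logits, labels,
                      student_feats, teacher_feats,
                      T=4.0, beta=1000.0):
    # Hard-label cross-entropy on the bolt defect labels, per sample.
    ce = F.cross_entropy(student_logits, labels, reduction="none")

    # Soft-label KD term (Hinton et al.), scaled by T^2 as usual.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="none").sum(dim=1) * (T * T)

    # Assumed adaptive weighting: trust the teacher's soft labels more on
    # samples where it is confident in the true class, and fall back to
    # the hard label elsewhere.
    with torch.no_grad():
        w = F.softmax(teacher_logits, dim=1).gather(
            1, labels.unsqueeze(1)).squeeze(1)      # teacher confidence in y
    output_loss = ((1.0 - w) * ce + w * kd).mean()

    # Hidden-layer attention transfer between matched residual stages;
    # spatial sizes match for wide-ResNet teacher/student pairs.
    at = sum(F.mse_loss(attention_map(s), attention_map(t))
             for s, t in zip(student_feats, teacher_feats))

    return output_loss + beta * at
```

Here `beta` balances output-layer supervision against attention transfer; the value 1000 follows the common attention-transfer setting, not a figure from the thesis.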
Bolt images also suffer from low resolution and poor visual information, which degrades the classification performance of a streamlined small model. To address this, the thesis proposes an optimization method based on optimal knowledge transfer for the distillation model, from the perspective of network structure matching. First, the streamlined small model is reduced to three residual blocks; the reduced model still retains low-, mid-, and high-level network structure, while the feature dimensions of the large model's bolt representation are widened to increase the amount of bolt knowledge transferred to the streamlined small model. Large models of different widths then guide the streamlined small model's training with the output-layer knowledge distillation algorithm and the hidden-layer attention transfer algorithm, respectively. To select the large model that transfers bolt knowledge best, a knowledge deviation index is proposed to visualize the degree of bolt knowledge transfer from the large model to the streamlined small model. Jointly analyzing knowledge deviation and the streamlined small model's classification accuracy identifies the large model of width 5 as the optimal knowledge transfer model. Finally, the optimal knowledge transfer model (ResNet-40-5) combines the adaptive weighting algorithm and the attention transfer algorithm to guide the training of the streamlined small model (ResNet-10-1). Verified on the self-built bolt defect image classification data set, the streamlined small model reaches a classification accuracy of 88.85%, an increase of 5.59%; its accuracy is only 2.14% below that of the large model, the knowledge deviation is 0.28, and its parameter count is only 0.56% of the large model's.
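The abstract reports a knowledge deviation of 0.28 but does not define the index, so the following is only one plausible reading: the mean divergence between the softened teacher and student predictions over a held-out set, where a lower value indicates more complete knowledge transfer. `teacher`, `student`, and `loader` are hypothetical placeholders.

```python
# A plausible sketch of a knowledge deviation metric, assuming it is the
# mean KL divergence between softened teacher and student outputs.
import torch
import torch.nn.functional as F

@torch.no_grad()
def knowledge_deviation(teacher, student, loader, T=4.0, device="cpu"):
    teacher.eval(); student.eval()
    total, n = 0.0, 0
    for images, _ in loader:
        images = images.to(device)
        p_t = F.softmax(teacher(images) / T, dim=1)
        log_p_s = F.log_softmax(student(images) / T, dim=1)
        # Per-sample KL(teacher || student); lower means better transfer.
        kl = F.kl_div(log_p_s, p_t, reduction="none").sum(dim=1)
        total += kl.sum().item()
        n += images.size(0)
    return total / n
```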
Keywords/Search Tags: bolt defect, knowledge distillation, attention transfer, knowledge deviation, adaptive weighting