
Research On Soft Filter Pruning Algorithm Of Neural Network Based On Differential Equation

Posted on: 2024-09-17  Degree: Master  Type: Thesis
Country: China  Candidate: G M Yang  Full Text: PDF
GTID: 2568307103970049  Subject: Computer technology
Abstract/Summary:
The main purpose of network pruning is to reduce the number of parameters and the amount of computation in neural networks, in order to compress models, speed up inference, and reduce storage space, enabling models to run on devices with limited hardware resources such as mobile and embedded devices. Many network pruning algorithms have been proposed, but they still suffer from problems such as capacity impairment and partial loss of valid information during model training. In this thesis, we propose soft filter pruning algorithms for neural networks based on differential equations, which improve the performance of the pruned networks by keeping the model capacity unchanged during training and reducing information loss, and we extend the soft filter pruning algorithm to object detection to verify its generalization ability. The main contributions of this thesis are as follows:

(1) To solve the problems of hard filter pruning, such as model capacity reduction and loss of effective information during training, a soft filter pruning algorithm for neural networks based on an element-decay differential equation is proposed. The algorithm consists of four steps: filter selection, weight decay, model training, and filter removal. After each round of training, the L2-norms of all filters are computed and sorted from highest to lowest, and the filters with the lowest L2-norms are selected. The selected filters are decayed according to the formula derived from the element-decay differential equation, and the model is then trained for the next round. After iterating over these three steps, the filters whose weights are all zero are removed from the network to obtain the final pruned model. Extensive comparison experiments verify that the algorithm achieves good results on both shallow and deep networks; on deep networks it even surpasses the accuracy of the original unpruned network, indicating that decay-based soft filter pruning reduces information loss in the network.

(2) To solve the problem of excessive decay in the above soft filter pruning algorithm, which causes an excessive loss of network accuracy after pruning, a soft filter pruning algorithm for neural networks based on the logistic-growth differential equation is proposed. The algorithm again consists of four main steps: filter selection, weight decay, model training, and filter removal. In the filter selection step, all filters in the network are ranked by their L2-norm, and filters with low L2-norm proceed to the decay step. The weights of the selected filters are then decayed according to a three-stage adaptive decay formula derived from the logistic growth model, which replaces the original element-decay formula. Model training is then performed, i.e., the decayed network undergoes one more round of forward and backward propagation. After cycling through the first three steps, training is completed and the all-zero filters are removed. Experiments show that this algorithm solves the problems of the algorithm in Chapter 3 and that the pruned network outperforms that of the Chapter 3 algorithm, indicating that the logistic-growth-based soft filter pruning algorithm performs better and effectively improves the stability of the network.

(3) To address the fact that pruning algorithms are rarely applied to object detection networks, a pruning scheme for object detection networks based on the two pruning algorithms in this thesis is proposed, in order to verify their generalization ability. First, only the Backbone of the YOLOv5l network, which performs feature extraction in the same way as a classification network, is pruned. The experimental results show that a good mean average precision is maintained even as the number of parameters and the computational cost decrease. In addition, pruning of the Neck, which plays the role of bounding-box regression, is also added. Experiments pruning both the Backbone and the Neck show that the mean-average-precision value decreases only slightly after adding more pruned parts. This scheme verifies the generalization ability of the soft filter pruning algorithms of Chapters 3 and 4 on object detection networks.
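The select-decay-train loop described in contribution (1) can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the pruning ratio, the decay constant `lam`, and the use of the one-step closed-form solution of dw/dt = -λw (i.e. w ← w·e^(-λ)) are assumptions made for the sketch.

```python
import numpy as np

def l2_norms(filters):
    """L2-norm of each filter (filters: array of shape [n_filters, ...])."""
    return np.sqrt((filters.reshape(len(filters), -1) ** 2).sum(axis=1))

def element_decay(weights, lam):
    """One-step element decay: solution of dw/dt = -lam * w over a unit
    step, i.e. w <- w * exp(-lam). Weights shrink toward zero each round."""
    return weights * np.exp(-lam)

def soft_prune_step(filters, prune_ratio, decay_fn):
    """One round of soft filter pruning:
    1) rank filters by L2-norm (ascending: weakest first),
    2) decay the lowest-ranked fraction,
    3) training then continues with all filters still in the model.
    Returns the updated filters and the indices that were decayed."""
    n_prune = int(len(filters) * prune_ratio)
    order = np.argsort(l2_norms(filters))
    selected = order[:n_prune]
    for i in selected:
        filters[i] = decay_fn(filters[i])
    return filters, selected
```

Because the decayed filters remain in the network during training, the model capacity is unchanged until the final removal step, which is the property the thesis contrasts with hard pruning.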
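For the logistic-growth variant in contribution (2), the thesis's three-stage adaptive decay formula is not given in this abstract, so the sketch below only illustrates the general idea: the decay rate is assumed to follow the closed-form solution of the logistic ODE dp/dt = r·p·(1 − p/K), which naturally produces three stages (slow start, rapid growth, saturation at K). The parameters `r`, `K`, and `p0`, and the way the rate is applied, are illustrative assumptions.

```python
import numpy as np

def logistic_rate(t, r=0.5, K=1.0, p0=0.01):
    """Decay rate following the logistic growth ODE dp/dt = r*p*(1 - p/K),
    with closed-form solution p(t) = K / (1 + ((K - p0)/p0) * exp(-r*t)).
    Small at the start of training, rising quickly mid-training, and
    saturating at K: three stages from one formula."""
    return K / (1.0 + (K - p0) / p0 * np.exp(-r * t))

def adaptive_decay(weights, t, r=0.5, K=1.0, p0=0.01):
    """Decay selected filter weights with a rate that adapts to epoch t,
    replacing the fixed element-decay factor of the first algorithm."""
    return weights * np.exp(-logistic_rate(t, r, K, p0))
```

Early in training the rate is near `p0`, so weak filters lose little information and can recover; only late in training does the rate approach `K` and drive the selected filters toward zero, which matches the abstract's claim of avoiding excessive decay.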
Keywords/Search Tags:Soft Pruning, Element Decay, Logistic Growth, Self-Adaptive, Object Detection