
Research on Simplification Methods for Neural Network Models

Posted on: 2024-04-07
Degree: Master
Type: Thesis
Country: China
Candidate: S Wang
Full Text: PDF
GTID: 2568306908985439
Subject: Computer Science and Technology
Abstract/Summary:
As more and more portable, low-power devices run deep learning models for computer vision tasks, the demand for simplifying neural network models keeps growing. Neural network pruning, a mainstay among model compression algorithms, has accordingly become a hot research topic. This thesis focuses on three aspects of pruning methods:

(1) Scoring Net is a neural-network-based pruning criterion that addresses the suboptimal solutions produced by hand-crafted criteria in traditional pruning methods. Compared with methods based on genetic algorithms or evolutionary strategies, Scoring Net's reliance on neural networks yields faster training and lower computational cost for comparable results. Experiments show that it achieves results at least on par with other pruning algorithms, and that it also facilitates transfer learning across datasets and tasks.

(2) To investigate why, under the lottery ticket hypothesis, reinitialization can render a selected "winning ticket" ineffective, this thesis proposes three reinitialization schemes: random initialization with preserved sign, random initialization with preserved distribution, and random initialization with preserved absolute value. These schemes isolate which specific factor causes the failure. The study finds that random initialization with preserved distribution best retains the advantageous "winning ticket" properties.

(3) Since different tasks benefit from different loss functions and optimizers, this thesis proposes using different loss functions and optimizers for Scoring Net at different stages. This approach measurably improves Scoring Net's performance and accelerates the convergence of the pruned network during post-pruning training.
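As context for point (1): the abstract does not specify Scoring Net's internals, but the hand-crafted baseline it improves upon is typically magnitude-based structured pruning, where whole convolutional filters are scored by their L1 norm and the lowest-scoring ones are removed. The sketch below is a minimal NumPy illustration of that baseline (function name and layout are illustrative, not from the thesis):

```python
import numpy as np

def prune_filters_l1(W, keep_ratio):
    """Illustrative hand-crafted structured-pruning criterion.

    W          -- conv weight tensor of shape (out_channels, in_channels, kH, kW)
    keep_ratio -- fraction of output filters to keep

    Scores each output filter by the L1 norm of its weights and
    returns the (sorted) indices of the filters to keep.
    """
    # One score per filter: sum of absolute weights across all its entries.
    scores = np.abs(W).reshape(W.shape[0], -1).sum(axis=1)
    # Keep at least one filter.
    k = max(1, int(round(keep_ratio * W.shape[0])))
    # Indices of the k highest-scoring filters, in ascending index order.
    keep = np.argsort(scores)[-k:]
    return np.sort(keep)
```

A learned criterion such as Scoring Net replaces the fixed `scores` computation with the output of a trained scoring network, which is what lets it escape the suboptimal solutions of a fixed rule.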
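The three reinitialization schemes in point (2) can be sketched concretely. The following is a minimal NumPy interpretation, assuming each scheme redraws a weight matrix while preserving exactly one property of the original "winning ticket" weights (function names and the choice of a Gaussian for the redraws are assumptions, not from the thesis):

```python
import numpy as np

def reinit_preserve_sign(W, rng):
    """Fresh random magnitudes, but keep each weight's original sign."""
    fresh = rng.normal(0.0, W.std(), size=W.shape)
    return np.sign(W) * np.abs(fresh)

def reinit_preserve_distribution(W, rng):
    """Redraw all weights from a distribution matching W's mean and std."""
    return rng.normal(W.mean(), W.std(), size=W.shape)

def reinit_preserve_abs(W, rng):
    """Keep each weight's original magnitude, randomize only its sign."""
    signs = rng.choice([-1.0, 1.0], size=W.shape)
    return signs * np.abs(W)
```

Comparing a ticket's post-training accuracy under each scheme isolates which property (sign, distribution, or magnitude) the winning ticket actually depends on; the thesis reports that the distribution-preserving variant retains the ticket's advantage best.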
Keywords/Search Tags:Convolutional Neural Network, Structured Pruning, Model Compression, Pruning at Initialization, Pruning Criterion