
Optimizing Deep Neural Network By Using The Relationship Of Parameters

Posted on: 2020-11-22
Degree: Master
Type: Thesis
Country: China
Candidate: Y F He
Full Text: PDF
GTID: 2428330605979269
Subject: Computer software and theory

Abstract/Summary:
Deep neural networks are one of the important research directions in the field of artificial intelligence. As deep neural networks have developed, the scale of their parameters has kept increasing. Although more parameters bring better performance, they also make a network harder to train and more prone to over-fitting, and excessive parameters can even degrade performance. To address these problems, it is necessary to analyze the relationships among the parameters within the network and to use those relationships to optimize the deep neural network. The main contents of this paper are as follows:

Firstly, the effects of various parameter relationships on the performance of deep convolutional neural networks are analyzed experimentally. The larger the absolute value of a weight connection parameter, the greater its contribution to the network; the performance of the model depends mainly on a few large weight connection parameters. There is a positive correlation between the number of convolution kernels and the generalization ability of the network. Weight connection parameters with weak volatility influence network convergence more than those with strong volatility, and initial weights drawn from a normal distribution are more stable than those drawn from a uniform distribution.

Secondly, an optimization algorithm based on large-weight suppression is proposed. It adjusts the weight-update strategy during training by introducing a suppression coefficient related to the magnitude of each weight; the coefficient scales the back-propagated weight increment so as to control the distribution of connections with large weights. Under different experimental conditions, this method reduces the model's sensitivity to large weight parameters, improves its generalization ability and robustness, and effectively suppresses over-fitting. The method can also be used to re-optimize an already trained network model.

Thirdly, the optimization performance of the algorithm is evaluated. The time and space complexity of the algorithm are O(w log2 w) and O(w), respectively, where w is the number of weight connections. Experimental results show that the contribution of large weights in the optimized network is dispersed over more weights, verifying that the large-weight suppression algorithm improves stability during training.
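The abstract does not give the exact form of the suppression coefficient, so the following is only a minimal illustrative sketch of the idea: during each update, the back-propagated increment for a weight is scaled by a coefficient that shrinks as the weight's magnitude grows, discouraging the model from concentrating its performance on a few large weights. The coefficient form 1 / (1 + alpha * |w|) and the hyperparameter alpha are hypothetical choices, not the thesis's formula.

```python
import numpy as np

def suppressed_update(w, grad, lr=0.01, alpha=0.5):
    """One hypothetical large-weight-suppression step.

    Each weight's gradient increment is scaled by a suppression
    coefficient in (0, 1] that decreases with |w|, so large weights
    receive smaller updates than small weights under the same gradient.
    The 1 / (1 + alpha * |w|) form is an illustrative assumption.
    """
    coeff = 1.0 / (1.0 + alpha * np.abs(w))  # close to 1 for small |w|, small for large |w|
    return w - lr * coeff * grad

# Example: with equal gradients, the small weight moves farther
# than the large weight, dispersing contribution across more weights.
w = np.array([0.1, 5.0])
g = np.array([1.0, 1.0])
w_new = suppressed_update(w, g)
```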
Keywords/Search Tags:Deep Neural Network, Weight Connection Parameters, Suppression Coefficient, Generalization Ability, Robustness