
Research On Artificial Neural Network Structure Optimization Method

Posted on: 2020-03-23
Degree: Master
Type: Thesis
Country: China
Candidate: X Feng
Full Text: PDF
GTID: 2428330575498894
Subject: Engineering
Abstract/Summary:
The structure of a neural network directly determines its performance and efficiency. Structural optimization of neural networks is not only a hot issue in the field, but also a key step that cannot be bypassed in engineering applications. As network depth increases, the difficulty of structural optimization grows, so solving this problem has both theoretical and practical significance for neural network design and application. Based on information theory and statistics, this thesis defines several pseudo-entropy measures to reflect the capacity of neurons in artificial neural networks to carry data. It also applies sequence models to serialize the structure of deep convolutional neural networks, combining them with optimization algorithms to improve model training efficiency. This not only realizes adaptive optimization of the neural network structure, but also enhances the universality of neural networks and promotes their development. The main work of this thesis includes:

1. Structure optimization algorithm design for BP neural networks and Restricted Boltzmann Machines. Feedforward neural networks and Restricted Boltzmann Machines are two important models of artificial neural networks, and they provide important training mechanisms and optimization methods for the rapid development of today's deep learning. For these two models, we study structural optimization of fully connected networks: information entropy and mean squared error are used to describe the local and global working state of the network, and the network structure is adjusted according to this state until the model structure stabilizes. Because these two models are closely related to the deep belief network, this thesis also implements and experimentally verifies a structural self-organization optimization algorithm for deep belief networks.

2. Structure optimization algorithm design for deep convolutional neural network models. Structural optimization of deep convolutional neural networks is more difficult than that of traditional networks, mainly because convolutional networks have more parameters and a richer variety of components: in addition to the parameters of the convolutional and pooling layers, components such as Dropout, Batch Normalization, Softmax, and Dense layers combine to form a deep network model. Treating component selection as spatial sequence modeling is therefore a comparatively good structural optimization approach; this thesis uses an LSTM network as the controller for component selection and connection.

3. Experimental design for the neural network optimization algorithms. Because the models work on different principles, we design experiments to verify the actual effect of each structure optimization algorithm and compare it against recently proposed optimization algorithms. For classification tasks, the MNIST, CIFAR-10, CIFAR-100, and Cats vs. Dogs datasets are used to verify the models' capabilities; for regression tasks, the fit to a nonlinear system model better reflects the models' capabilities.
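The entropy-based working-state measure used in work item 1 can be sketched as follows. This is a minimal illustration, not the thesis's exact pseudo-entropy definition: the histogram binning scheme and the idea of ranking neurons by activation entropy for pruning are assumptions made for the example.

```python
import numpy as np

def activation_entropy(activations, bins=16):
    """Estimate the entropy of each neuron's activation distribution
    over a batch (a hypothetical 'pseudo entropy' measure).

    activations: array of shape (batch_size, n_neurons)
    Returns one entropy value (in bits) per neuron; neurons with
    near-zero entropy carry little information and would be the
    first candidates for removal when adjusting the structure.
    """
    entropies = []
    for col in activations.T:
        hist, _ = np.histogram(col, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]                       # drop empty bins
        entropies.append(-np.sum(p * np.log2(p)))
    return np.array(entropies)

# Toy example: one informative neuron, one nearly-constant neuron.
rng = np.random.default_rng(0)
acts = np.stack([rng.normal(size=1000),        # varied activations
                 np.full(1000, 0.5)], axis=1)  # constant activations
H = activation_entropy(acts)
# The constant neuron's entropy is ~0, so it contributes no information.
```

In a structure-adaptation loop, such a per-neuron score could be combined with a global error measure (e.g. mean squared error on a validation set) to decide when to shrink or grow a layer.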
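The LSTM-controller idea in work item 2 can be sketched as an autoregressive sampler: at each step the controller emits a distribution over candidate components, a component is sampled, and the choice is fed back in as the next input. The component names, sizes, and the hand-rolled single-cell LSTM below are illustrative assumptions; training the controller (e.g. with validation accuracy as a reward signal) is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Candidate components the controller chooses among at each step
# (an illustrative search space, not the thesis's exact one).
COMPONENTS = ["conv3x3", "conv5x5", "maxpool", "batchnorm", "dropout", "dense"]
V = len(COMPONENTS)
H_SIZE = 32

# Randomly initialized controller weights (untrained in this sketch).
Wx = rng.normal(0, 0.1, (4 * H_SIZE, V))       # input -> gates
Wh = rng.normal(0, 0.1, (4 * H_SIZE, H_SIZE))  # hidden -> gates
b = np.zeros(4 * H_SIZE)
Wo = rng.normal(0, 0.1, (V, H_SIZE))           # hidden -> logits

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    """One LSTM cell update: input, forget, cell, and output gates."""
    gates = Wx @ x + Wh @ h + b
    i, f, g, o = np.split(gates, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

def sample_architecture(n_layers=5):
    """Autoregressively sample a layer sequence: each step feeds the
    previous choice (one-hot) back into the LSTM controller."""
    h, c = np.zeros(H_SIZE), np.zeros(H_SIZE)
    x = np.zeros(V)                    # start token: all-zero input
    arch = []
    for _ in range(n_layers):
        h, c = lstm_step(x, h, c)
        logits = Wo @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()                   # softmax over components
        choice = rng.choice(V, p=p)
        arch.append(COMPONENTS[choice])
        x = np.eye(V)[choice]          # feed the choice back in
    return arch

arch = sample_architecture()
```

Each sampled sequence describes one candidate network; evaluating candidates and updating the controller toward higher-reward sequences is what turns this sampler into a structure optimizer.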
Keywords/Search Tags: Neural network, information entropy, sequence modeling, adaptive structure