
Research on Handwritten Digit Recognition Based on an Improved Neural Network Algorithm

Posted on: 2021-05-18
Degree: Master
Type: Thesis
Country: China
Candidate: M G Guo
Full Text: PDF
GTID: 2428330614464230
Subject: Computer application technology
Abstract/Summary:
With the continuous development of deep learning, handwritten digit recognition technology has been widely applied in production and everyday life, including banking, education, medical care, and postal services. In many application fields, however, the task of recognizing handwritten digits is still performed entirely by humans, which consumes considerable manpower and material resources. Although many scholars have produced results in handwritten digit recognition research, improvements in computer hardware and the continued adoption of machine learning mean that the demands placed on convolutional neural network-based recognition, in terms of speed, accuracy, and error rate, keep rising. How to build further on existing research results, rely on computers to greatly simplify the recognition task, save resources, and improve work efficiency has therefore become an important research topic in convolutional neural network image classification.

This thesis uses the TensorFlow machine learning framework and the AlexNet convolutional neural network to address handwritten digit recognition. In AlexNet, the Swish activation function introduces a large number of parameters into the back-propagation of the error gradient, which causes heavy computation and slow convergence, while the ReLU activation function has a zero derivative on the negative x interval, so negative gradients are set to zero and neurons may fail to activate. To address these problems, a new activation function, ReLU-Swish, is proposed: a piecewise function that applies Swish where the input is less than zero and ReLU where the input is greater than zero. Comparison experiments were carried out on the CIFAR-10 and MNIST datasets. The experimental results show that the ReLU-Swish activation function achieves a significant improvement in convergence speed and test accuracy compared with both the Swish and ReLU activation functions.

To address AlexNet's shortcomings in performance-to-power ratio, a heterogeneous CPU + GPU collaborative computing model is proposed. In this model, the CPU is responsible for logic-heavy transaction processing and serial computation, while the GPU performs highly threaded parallel processing tasks. Experimental comparisons with single-GPU and single-CPU training show that the heterogeneous CPU + GPU model achieves a better performance ratio. Because LRN (Local Response Normalization) has no learnable parameters, WN (Weight Normalization) is proposed as a replacement; placing WN after all the pooling layers improves the training accuracy of the AlexNet model. Finally, a comparative analysis of how the Adam, RMSprop, and Momentum optimizers behave under different learning rates yields the corresponding optimal learning-rate intervals for each, improving the accuracy of learning-rate selection for AlexNet's optimizers.
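The piecewise construction described above (Swish on the negative interval, ReLU on the positive interval) can be sketched as follows. This is a minimal NumPy illustration of the definition as stated in the abstract, not the thesis's actual implementation; the function name `relu_swish` is chosen here for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu_swish(x):
    """Piecewise activation: Swish (x * sigmoid(x)) where x < 0,
    ReLU (identity, since x >= 0) elsewhere."""
    return np.where(x < 0, x * sigmoid(x), x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu_swish(x))
```

Like Swish, this keeps a small nonzero gradient on the negative interval, so neurons are not permanently deactivated; like ReLU, it stays cheap and linear for positive inputs.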
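Weight Normalization, which the abstract proposes in place of LRN, reparameterizes each weight vector as a learnable direction scaled by a learnable magnitude. The sketch below shows the standard WN reparameterization w = g · v / ‖v‖ in NumPy; it illustrates the general technique only, under the assumption that the thesis uses the standard formulation, and is not taken from the thesis itself.

```python
import numpy as np

def weight_norm(v, g):
    """Weight Normalization: w = g * v / ||v||.
    v (direction) and g (magnitude) are both learnable parameters,
    unlike LRN, which has no learnable parameters at all."""
    return g * v / np.linalg.norm(v)

v = np.array([3.0, 4.0])   # ||v|| = 5
w = weight_norm(v, g=2.0)
print(w)                   # the norm of w equals g
```

Decoupling direction from magnitude in this way gives the optimizer an explicit, trainable scale per weight vector, which is the property LRN lacks.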
Keywords/Search Tags: handwritten digit recognition, convolutional neural network, activation function, normalization