
Deep Convolutional Neural Networks Based On Deterministic DropConnect

Posted on: 2019-09-01
Degree: Master
Type: Thesis
Country: China
Candidate: H Y Li
Full Text: PDF
GTID: 2428330593451660
Subject: Information and Communication Engineering
Abstract/Summary:
As a widely used deep learning algorithm, the convolutional neural network has achieved significant results in many fields such as image recognition, speech processing, and biomedicine. With the rapid development of computer technology, the availability of big data and high-performance computing has driven the development of convolutional neural networks. However, some shortcomings remain that need to be further studied and overcome.

Compared with conventional neural networks, deep convolutional neural networks can learn richer and more discriminative features. However, because of their large numbers of layers and parameters, deep convolutional neural networks are prone to overfitting. To address this problem, this thesis presents a deep convolutional neural network algorithm based on deterministic DropConnect. During training, the algorithm deterministically selects the weights with small absolute values and sets them to zero, which increases the sparsity of the network and suppresses overfitting to the training data. In image classification experiments, the deterministic DropConnect algorithm proposed in this thesis effectively improves the classification performance of the network and outperforms other regularization methods.

Because of their large number of parameters and high computational complexity, deep convolutional neural networks are of limited use on resource-constrained devices. In existing convolutional neural networks based on a single classifier, every sample must pass through all layers for feature extraction and is classified only at the last layer, so computing resources are wasted on samples that are easy to classify. To solve this problem, this thesis presents a multi-classifier convolutional neural network based on a cascaded structure. Simple samples are classified early, by the first several layers using low-level features, whereas complicated samples are classified by the last several layers using high-level features, which effectively reduces the computational complexity. Compared with a conventional single-classifier network, the cascaded convolutional neural network reduces the computational complexity by 26.9% at a similar recognition accuracy. The experimental results show that the cascaded multi-classifier structure effectively reduces the overall computational burden of the network, improves its computational efficiency, and further promotes the application of deep convolutional neural networks in more fields.
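The small-weight selection at the heart of deterministic DropConnect can be sketched as follows. This is a minimal NumPy illustration, not the thesis's actual implementation: it assumes the selection rule is "zero the fraction `drop_rate` of weights with the smallest absolute values," applied per weight matrix at each training step; the thesis may define the drop fraction or selection scope differently.

```python
import numpy as np

def deterministic_dropconnect(weights, drop_rate):
    """Return a copy of `weights` in which the fraction `drop_rate`
    of entries with the smallest absolute values are set to zero.

    Unlike standard (random) DropConnect, the dropped connections are
    chosen deterministically by weight magnitude, which sparsifies the
    network toward its least influential connections.
    """
    out = weights.copy()
    k = int(drop_rate * weights.size)  # number of connections to drop
    if k > 0:
        # Indices of the k smallest |w| in the flattened matrix.
        drop_idx = np.argsort(np.abs(weights), axis=None)[:k]
        out.flat[drop_idx] = 0.0
    return out
```

In a training loop, this mask would be applied to each layer's weight matrix before the forward pass, so gradients flow only through the surviving (large-magnitude) connections for that step.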
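The routing logic of the cascaded multi-classifier structure can be sketched as an early-exit rule. The sketch below is hypothetical: the thesis does not state its exact exit criterion here, so a confidence threshold on the early classifier's softmax output is assumed, and `early_head`/`late_head` stand in for classifiers attached to the network's shallow and deep layers respectively.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def cascaded_predict(x, early_head, late_head, threshold=0.9):
    """Classify `x` with an early-exit cascade.

    `early_head(x)` returns logits from a classifier on low-level
    features; if its most confident class exceeds `threshold`, the
    sample exits early and the deeper layers are skipped. Otherwise
    `late_head(x)` (high-level features) makes the final decision.
    """
    probs = softmax(early_head(x))
    if probs.max() >= threshold:
        return int(probs.argmax()), "early"
    probs = softmax(late_head(x))
    return int(probs.argmax()), "late"
```

Easy samples thus pay only the cost of the shallow layers, which is how the cascade trades a fixed exit threshold for the reported reduction in overall computation.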
Keywords/Search Tags:Convolutional neural networks, Deterministic DropConnect, Overfitting, Cascaded multi-classifier