
Research On Structure Optimization Method Of Lightweight Convolution Neural Network

Posted on: 2023-12-28    Degree: Master    Type: Thesis
Country: China    Candidate: Z H Lv    Full Text: PDF
GTID: 2568306788463874    Subject: Software Engineering Technology
Abstract/Summary:
In recent years, lightweight convolutional neural networks have shown significant advantages in image classification: they improve the computational efficiency of the model and can be deployed on small mobile devices. However, existing research mainly focuses on the lightweight design of convolutional structures and neglects the optimization of other general-purpose modules in the model, such as the squeeze-and-excitation module. In addition, since the pointwise convolution structure in lightweight convolutional neural networks contains a large number of parameters and consumes excessive computing resources, a corresponding lightweight module is needed to optimize it. Aiming at these two problems, this thesis studies lightweight convolutional neural networks; the contributions are as follows:

1. Lightweight height and width squeeze-and-excitation modules based on multi-dimensional squeeze and excitation are proposed. In this method, the attention mechanism module is embedded directly into the lightweight model to improve its ability to capture regions of interest. First, to enhance the expressiveness of feature information, feature-processing operations along the height and width dimensions are added to the pooling stage, following the implementation principle of the squeeze-and-excitation module, and the SE-HW attention module is proposed. Second, to reduce the parameters and computation of the embedded attention module, the squeeze and excitation operations are carried out along the height and width dimensions of the feature map separately, yielding the lightweight height squeeze-excitation module HD-SE and the width squeeze-excitation module WD-SE. Finally, the SE-HW, HD-SE, and WD-SE modules are embedded into the MobileNetV1 and MobileNetV2 models, respectively. Experimental results on the food-101, CIFAR10, and CIFAR100 datasets show that the proposed method is effective compared with other attention mechanism modules.

2. A pointwise convolution optimization module based on Ghost and a feature-oriented redundancy loss function are proposed. This method optimizes the lightweight pointwise convolution structure to reduce parameters and computation, and modifies the cross-entropy loss function to further improve the feature-extraction capability of the model. First, to enhance the channel information of the feature maps generated by cheap convolutions, the Ghost-Point module is proposed, which applies average pooling and depthwise convolution to the feature maps generated by the 1×1 main convolution. Second, to enhance the redundancy of the features generated by the Ghost-Point module, this thesis constructs a nonlinear redundancy coefficient from the variance of the filters' L2 norms, uses it to reconstruct the cross-entropy loss function, and proposes the Abunt Loss function. Finally, the MobileNetV1 and MobileNetV2 models are optimized with the Ghost-Point module and the Abunt Loss function. Experiments on the food-101, CIFAR10, and CIFAR100 datasets show better classification results than other methods. Furthermore, the MobileNetV1 model is optimized by integrating the SE-HW module, the Ghost-Point module, and the Abunt Loss function; experimental results on the food-101 dataset show that the proposed method is effective.

This thesis has 30 figures, 29 tables, and 63 references.
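The abstract describes HD-SE and WD-SE as squeeze-and-excitation applied separately along the height and width dimensions of the feature map, but does not give their exact formulation. The NumPy sketch below illustrates one plausible reading: the feature map is averaged over every dimension except height (or width), and a small two-layer bottleneck produces per-row (or per-column) gating weights. The function names, bottleneck width, and pooling choices are assumptions for illustration, not the thesis's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hd_se(x, w1, w2):
    """Hypothetical height squeeze-excitation (HD-SE) sketch.
    x: feature map of shape (C, H, W).
    Squeezes over channel and width to get one descriptor per row,
    runs it through a two-layer bottleneck, and rescales each row."""
    z = x.mean(axis=(0, 2))          # squeeze -> shape (H,)
    a = np.maximum(z @ w1, 0.0)      # reduction layer + ReLU
    s = sigmoid(a @ w2)              # excitation -> per-row gates in (0, 1)
    return x * s[None, :, None]      # reweight rows of every channel

def wd_se(x, w1, w2):
    """Width counterpart (WD-SE): per-column attention."""
    z = x.mean(axis=(0, 1))          # squeeze -> shape (W,)
    a = np.maximum(z @ w1, 0.0)
    s = sigmoid(a @ w2)
    return x * s[None, None, :]      # reweight columns of every channel
```

Because the squeezed descriptor has length H (or W) rather than C, the bottleneck weights are far smaller than in a channel-wise SE block, which matches the abstract's claim of reduced parameters and computation.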
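The Ghost-Point module is described only at a high level: a 1×1 main convolution produces intrinsic features, and average pooling plus depthwise convolution generate the remaining channels cheaply. The sketch below, in NumPy, follows that description under stated assumptions (a 3×3 averaging window, a 3×3 depthwise kernel, zero padding, and a 50/50 split between intrinsic and ghost channels); none of these specifics come from the thesis itself.

```python
import numpy as np

def pointwise_conv(x, w):
    # 1x1 convolution as a channel-mixing matmul; x: (C_in, H, W), w: (C_out, C_in)
    return np.einsum('oc,chw->ohw', w, x)

def depthwise3x3(x, k):
    # naive per-channel 3x3 convolution with zero padding; x: (C, H, W), k: (C, 3, 3)
    C, H, W = x.shape
    p = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros_like(x)
    for i in range(3):
        for j in range(3):
            out += k[:, i, j][:, None, None] * p[:, i:i + H, j:j + W]
    return out

def ghost_point(x, w_main, k_cheap):
    """Hypothetical Ghost-Point sketch: a slim 1x1 main convolution makes the
    intrinsic channels; 3x3 average pooling followed by a depthwise convolution
    makes the 'ghost' channels; the two halves are concatenated."""
    intrinsic = pointwise_conv(x, w_main)
    avg_kernel = np.full((intrinsic.shape[0], 3, 3), 1.0 / 9.0)
    pooled = depthwise3x3(intrinsic, avg_kernel)   # 3x3 average pooling
    ghost = depthwise3x3(pooled, k_cheap)          # cheap depthwise convolution
    return np.concatenate([intrinsic, ghost], axis=0)
```

The saving over a plain pointwise convolution comes from `w_main` having only half the output channels; the other half costs one depthwise pass per channel instead of a full channel-mixing matmul.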
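For the Abunt Loss, the abstract states only that a nonlinear redundancy coefficient is built from the variance of the filters' L2 norms and used to reconstruct the cross-entropy loss; no closed form is given. The sketch below is one hedged reading: the variance is squashed through a sigmoid and used to scale a standard softmax cross-entropy. The coefficient's exact nonlinearity and how it enters the loss are assumptions, not the thesis's definition.

```python
import numpy as np

def cross_entropy(logits, label):
    # numerically stable softmax cross-entropy for a single sample
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

def redundancy_coefficient(filters):
    """Hypothetical nonlinear redundancy coefficient: the variance of the
    per-filter L2 norms, squashed into (0, 1) with a sigmoid."""
    norms = np.linalg.norm(filters.reshape(filters.shape[0], -1), axis=1)
    return 1.0 / (1.0 + np.exp(-np.var(norms)))

def abunt_loss(logits, label, filters):
    """Sketch of a redundancy-weighted cross-entropy ('Abunt Loss'):
    the more the filter norms spread out, the larger the penalty."""
    return (1.0 + redundancy_coefficient(filters)) * cross_entropy(logits, label)
```

Under this reading the loss is always at least the plain cross-entropy, so training pressure increases with the spread of the filter norms, nudging the cheap branch toward generating more redundant (mutually similar) filters.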
Keywords/Search Tags:lightweight convolutional neural network, multi-dimensional squeeze and excitation, Ghost module, redundancy loss function