In recent years, as deep learning has been applied ever more widely to computer vision tasks, the structures of convolutional neural networks (CNNs) have become increasingly complex, demanding greater computational and storage capability from computing platforms. Designing lightweight CNNs is critical to solving this problem: it reduces the parameters and computational complexity of models and lowers their dependence on hardware. Moreover, compared with manually designing network structures, which requires extensive designer experience, neural architecture search (NAS) automatically searches for network structures with little prior knowledge. However, NAS consumes extensive computing resources and time because it must evaluate abundant candidate architectures. This thesis focuses on the search space, the search strategy, and the performance evaluation strategy to relieve the heavy memory usage and slow search speed of traditional NAS algorithms. The thesis makes three main contributions:

1. A lightweight CNN, DAS-Net (Densely Connected Network for Architecture Search), and an adaptive downsampling search algorithm for densely connected networks. Built on dense connections, DAS-Net increases the depth of learned group convolutions and composes them into a bottleneck structure to strengthen the learning of intra-group information while reducing network complexity. DAS-Net also introduces the squeeze-and-excitation module to learn inter-channel information and summarize complex features. To further reduce the complexity of DAS-Net, an adaptive downsampling search algorithm is proposed that sparsifies the dense connection matrix, removing redundant connections while preserving the network's performance. Compared with other networks, DAS-Net has fewer parameters and lower computational complexity, and performs better on the CIFAR and ILSVRC2012 datasets.

2. A Low-Memory Densely Connected Differentiable Architecture Search (LMD-DARTS) algorithm, to relieve the heavy memory usage and slow search speed of NAS during the gradient optimization process. The thesis proposes a continuization strategy based on weight redistribution to speed up the updating of the candidate operations' weights during the search. To update the network architecture by gradient descent, this method relaxes the discrete search space into a continuous one and attenuates the influence of low-weight operations on classification results, reducing the number of searches required. In addition, the thesis designs a dynamic sampler that prunes operations which perform poorly during the search, reducing memory consumption and the complexity of a single search. On this basis, the thesis proposes LMD-DARTS over the adaptively downsampled NAS search space. Experimental results show that LMD-DARTS reduces search time by 20% and lowers the memory consumption of the NAS algorithm, while the lightweight CNNs it finds retain good classification accuracy.

3. A PyQt5-based NAS system built on LMD-DARTS that runs on a PC. The system is used to search for a lightweight CNN on a household garbage dataset; the resulting CNN is deployed on an embedded development board to build a garbage classification system, verifying the effectiveness of LMD-DARTS.

This thesis has 32 figures, 17 tables, and 81 references.
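The adaptive downsampling idea of contribution 1, sparsifying a dense connection matrix while keeping the strongest links, can be illustrated in miniature. The abstract does not give the actual selection rule, so this is only an illustrative sketch under the assumption that each dense connection carries a learned importance weight and the weakest incoming connections of each layer are dropped; the function name and `keep_ratio` parameter are hypothetical.

```python
import math

def sparsify_connections(weights, keep_ratio=0.5):
    """Illustrative sketch of pruning a dense connection matrix.

    weights[i][j] is an assumed learned importance of the connection from
    an earlier layer j to layer i. For each layer, keep only the indices
    of its ceil(keep_ratio * n) strongest incoming connections.
    """
    pruned = []
    for incoming in weights:
        # always retain at least one input so no layer is disconnected
        k = max(1, math.ceil(keep_ratio * len(incoming)))
        # indices of the k largest importance weights
        keep = sorted(range(len(incoming)),
                      key=lambda j: incoming[j], reverse=True)[:k]
        pruned.append(sorted(keep))
    return pruned

# One layer with four dense inputs: keep the two strongest connections.
print(sparsify_connections([[0.9, 0.1, 0.5, 0.2]], keep_ratio=0.5))  # [[0, 2]]
```

Here a layer with incoming importances `[0.9, 0.1, 0.5, 0.2]` keeps connections 0 and 2, halving the dense connections for that layer while the strongest paths survive.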
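Contribution 2 combines a continuous relaxation of the discrete operation choice with weight redistribution and dynamic pruning of weak operations. The abstract does not spell out the exact update rules, so the following is only a DARTS-style sketch under assumed mechanics: architecture parameters are softmaxed into operation weights, operations whose weight falls below a threshold are pruned (as a dynamic sampler might do), and the surviving weights are renormalized, i.e. redistributed. All names and the `threshold` value are illustrative.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def redistribute(alphas, threshold=0.05):
    """Sketch of continuous relaxation with weight redistribution.

    Softmax the architecture parameters `alphas`, zero out operations
    whose weight falls below `threshold`, and renormalize so the
    surviving weights sum to 1. Returns (weights, surviving_indices);
    pruned operations could then be skipped in later search steps.
    """
    w = softmax(alphas)
    keep = [i for i, wi in enumerate(w) if wi >= threshold]
    total = sum(w[i] for i in keep)
    weights = [w[i] / total if i in keep else 0.0 for i in range(len(w))]
    return weights, keep

# Three candidate operations; the third is clearly weak and gets pruned,
# its mass redistributed over the two survivors.
weights, keep = redistribute([2.0, 0.0, -3.0])
print(keep)  # [0, 1]
```

After pruning, later mixed-operation evaluations need only compute the surviving operations, which is how dropping low-weight candidates can cut both memory use and per-search cost.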