In recent years, deep learning has developed rapidly. Neural networks have been applied to a variety of visual tasks, such as image classification, and have achieved remarkable results, and many excellent and efficient network architectures have been designed. Because the architecture plays a key role in model performance, considerable effort is usually spent on its design. However, as architectures become more complex and the number of parameters continues to grow, designing the network structure by hand becomes increasingly arduous. To alleviate this problem, neural architecture search has gradually attracted attention: by designing search algorithms that automatically find high-performing architectures in a large search space, it frees people from laborious manual design. This paper explores searching for an effective network structure with a neural architecture search algorithm to improve image classification performance. Differentiable Architecture Search (DARTS) is an efficient search algorithm based on gradient descent. However, DARTS tends to select operators with no trainable parameters, which causes network degradation due to a sharp reduction in the number of parameters. Many works address this problem by focusing on one specific operator, but this paper identifies a more general pattern: the depth of an operator affects the search result. Therefore, this paper proposes a regularization method based on operator depth and complexity, which effectively limits the unfair preference of DARTS during search, prevents network degradation and collapse, and makes the search more stable. On the other hand, due to memory and computing constraints, many works search in a small proxy space and then stack the searched cell to form the final network, so search and evaluation are not performed in the same space, which introduces a gap between them. This paper instead redesigns the search space so that architectures can be searched and evaluated in the same space. Unlike the search spaces of previous work, the space designed in this paper also considers the structure between sub-blocks and thus incorporates global information. In addition, this paper proposes sampling and parameter-sharing strategies for the new search space to reduce resource overhead. As a result, search and evaluation are carried out directly in the same space, rather than by stacking cells with the same structure to form the final network. Finally, this paper achieves an accuracy of 97.41% on the CIFAR-10 dataset.
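To make the idea of a depth- and complexity-based regularizer more concrete, the following sketch illustrates one possible form of such a penalty on the DARTS architecture parameters. It is a minimal illustration under assumed names and values (OP_DEPTH, target_depth, lam, and the candidate operator list are hypothetical), not the paper's actual implementation: each candidate operator is assigned a depth/complexity score, and edges whose expected operator depth drifts toward the parameter-free end are penalized.

```python
import torch
import torch.nn.functional as F

# Hypothetical depth/complexity scores for the candidate operators on each edge.
# Parameter-free operators get a score of 0; deeper operators get larger scores.
# These operators and values are illustrative, not the paper's configuration.
OP_DEPTH = torch.tensor([
    0.0,  # identity / skip connection (no trainable parameters)
    0.0,  # 3x3 max pooling            (no trainable parameters)
    1.0,  # 3x3 separable convolution
    2.0,  # two stacked 3x3 separable convolutions
])

def depth_regularizer(alpha: torch.Tensor,
                      op_depth: torch.Tensor = OP_DEPTH,
                      target_depth: float = 1.0,
                      lam: float = 0.01) -> torch.Tensor:
    """Penalty on DARTS architecture logits `alpha` of shape [num_edges, num_ops].

    The softmax of alpha gives per-edge operator probabilities; if the expected
    operator depth on an edge falls below `target_depth`, i.e. the search is
    drifting toward parameter-free operators, a quadratic penalty is applied.
    """
    probs = F.softmax(alpha, dim=-1)                   # [num_edges, num_ops]
    expected_depth = (probs * op_depth).sum(dim=-1)    # [num_edges]
    shortfall = F.relu(target_depth - expected_depth)  # only penalize "too shallow"
    return lam * (shortfall ** 2).mean()

# Possible usage in the architecture-update step (assumed variable names):
#   loss = criterion(model(x_val), y_val) + depth_regularizer(arch_logits)
```

A penalty of this kind only discourages collapse onto parameter-free operators; the exact weighting between operator depth and complexity, and how it interacts with the bilevel optimization in DARTS, follows the method described in the paper.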