
Research And Application Of Multi-fidelity Based Hyper-parameter Optimization

Posted on: 2021-04-20    Degree: Master    Type: Thesis
Country: China    Candidate: X Liu    Full Text: PDF
GTID: 2428330605976020    Subject: Computer technology
Abstract/Summary:
Deep learning and neural networks have advanced tremendously across many industries. When a neural network is used for computation and prediction, the user must define various parameters of the network in advance. Early on, parameters were tuned by empirical rules of thumb; this approach is inefficient, and the choice of parameters strongly affects computational efficiency. The emergence of Automated Machine Learning (AutoML) alleviates this problem to some extent. This thesis focuses on the hyperparameter optimization problem. Gradients with respect to hyperparameters are difficult to obtain, so previous studies have commonly treated hyperparameter optimization as black-box optimization, whose evaluation cost is high. How to drastically reduce the computational cost of black-box optimization while still tuning parameters effectively has therefore become one of the difficult and active topics in hyperparameter optimization research.

Nature-inspired optimization draws on the behavior and habits of animals and plants in nature; by studying and modeling these natural phenomena, gradient-free black-box optimization algorithms have been proposed. Because they require no gradients and can escape local optima relatively easily, nature-inspired optimization algorithms have been applied in many fields in recent years and are also well suited to hyperparameter optimization.

This thesis carries out systematic research along two lines: nature-inspired optimization algorithms and multi-fidelity hyperparameter optimization. On the algorithmic side, the general algorithm framework is improved and extended by studying algorithms proposed in recent years. On the hyperparameter optimization side, a hierarchical, multi-fidelity optimization method based on nature-inspired optimization is proposed; after comparative tests, the whale optimization algorithm, which performed best, was selected as the underlying optimizer and verified experimentally.

Four commonly used image classification datasets were selected, and experiments were carried out to optimize eight hyperparameters of a three-layer convolutional neural network. The search efficiency of random search, manual tuning, and the multi-fidelity optimization method proposed in this thesis was compared. Experimental results show that the proposed multi-fidelity hyperparameter optimization method effectively reduces the time cost of the hyperparameter optimization process.
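To illustrate the general idea described above (not the thesis's actual implementation), the following minimal Python sketch combines a standard whale optimization update (Mirjalili & Lewis, 2016) with a simple two-level fidelity schedule: every candidate is first evaluated cheaply for a few training epochs, and only the most promising half is re-evaluated at higher fidelity. The search space, the evaluate() placeholder objective, and all numerical settings are illustrative assumptions; in the thesis, the objective would be the validation error of a three-layer convolutional neural network.

# Minimal sketch (assumptions noted above), not the thesis's code.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D search space: [learning_rate, dropout] within these bounds.
LOWER = np.array([1e-4, 0.0])
UPPER = np.array([1e-1, 0.7])

def evaluate(x, epochs):
    """Placeholder objective (lower is better). A real version would train the
    CNN for `epochs` epochs with hyperparameters `x` and return 1 - accuracy;
    fewer epochs means a cheaper but noisier estimate."""
    lr, drop = x
    return (np.log10(lr) + 2.0) ** 2 + (drop - 0.3) ** 2 + rng.normal(0, 1.0 / epochs)

def whale_step(pop, best, a):
    """One whale optimization position update over the whole population."""
    new = np.empty_like(pop)
    for i, x in enumerate(pop):
        r1, r2, p = rng.random(3)
        A, C = 2 * a * r1 - a, 2 * r2
        if p < 0.5:
            # Encircling the best whale, or exploring around a random whale.
            target = best if abs(A) < 1 else pop[rng.integers(len(pop))]
            new[i] = target - A * np.abs(C * target - x)
        else:
            # Logarithmic spiral move around the best whale.
            l = rng.uniform(-1, 1)
            new[i] = np.abs(best - x) * np.exp(l) * np.cos(2 * np.pi * l) + best
    return np.clip(new, LOWER, UPPER)

# Multi-fidelity loop: cheap screening, then full evaluation of the top half.
pop = rng.uniform(LOWER, UPPER, size=(10, 2))
best, best_score = None, np.inf
for it in range(20):
    cheap = np.array([evaluate(x, epochs=2) for x in pop])      # low fidelity
    top = np.argsort(cheap)[: len(pop) // 2]
    full = {i: evaluate(pop[i], epochs=20) for i in top}        # high fidelity
    i_best = min(full, key=full.get)
    if full[i_best] < best_score:
        best, best_score = pop[i_best].copy(), full[i_best]
    pop = whale_step(pop, best, a=2 * (1 - it / 20))            # shrinking coefficient

print("best hyperparameters:", best, "score:", best_score)

The two fidelity levels here (2 versus 20 epochs) and the "promote the top half" rule are only one simple way to realize the hierarchical, multi-fidelity idea; the thesis's own schedule may differ.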
Keywords/Search Tags:Hyper-parameters tuning, multi-fidelity optimization, nature-inspired optimization, convolutional neural network