
Research On Automatic Hyper-parameter Tuning Algorithm

Posted on: 2021-04-25    Degree: Master    Type: Thesis
Country: China    Candidate: H N Wu    Full Text: PDF
GTID: 2428330614450006    Subject: Computer Science and Technology
Abstract/Summary:
We are living in an era of continuous development of artificial intelligence (AI), an era built on big data. Data now plays an obvious and important role in every field, but in an age of explosive data growth, making reasonable use of these data to make daily life more convenient and more intelligent requires ever-increasing computing power in hardware and a variety of powerful algorithms, among them the widely used and rapidly developing machine learning algorithms. Supported by massive data and powerful computing resources, machine learning has kept advancing, and model training has become more efficient and convenient. At the same time, the hyper-parameters of machine learning algorithms have a great impact on their performance, so hyper-parameter selection is a key step in making these algorithms practical and an important guarantee of excellent performance. It is therefore very important to design algorithms that can automatically select good hyper-parameter configurations for machine learning models.

Many automatic hyper-parameter tuning algorithms have already been proposed. One family is Bayesian optimization based on different surrogate models, including the Hyperopt framework based on the Tree-structured Parzen Estimator (TPE), the SMAC framework based on random forests, and Bayesian optimization based on Gaussian process regression. Another family applies evolutionary algorithms, such as simulated annealing, the genetic algorithm, and particle swarm optimization, to the hyper-parameter optimization of machine learning models.

This thesis proposes one automatic hyper-parameter tuning algorithm for each of these two families. The first contribution is a hyper-parameter tuning algorithm based on MARS (Multivariate Adaptive Regression Splines) regression, which, like Bayesian optimization, is a data-driven optimization algorithm. Experiments show that the proposed algorithm achieves results comparable to Bayesian optimization while significantly improving time efficiency. The second contribution is an automatic hyper-parameter tuning algorithm based on the quantum genetic algorithm, which combines the core ideas of random search and evolutionary algorithms. Experiments on hyper-parameter tuning for a variety of traditional machine learning models show that the algorithm alleviates both the instability of random search and the slow iterative convergence of general evolutionary algorithms, and achieves good experimental results.
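As a point of reference for the Bayesian-optimization family mentioned above, the following is a minimal sketch of TPE-based tuning with the Hyperopt framework. The SVM search space and the iris data set are illustrative assumptions, not taken from the thesis.

```python
import numpy as np
from hyperopt import fmin, tpe, hp, Trials
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Search space: an SVM's C and gamma on log scales (illustrative choice).
space = {
    "C": hp.loguniform("C", np.log(1e-3), np.log(1e3)),
    "gamma": hp.loguniform("gamma", np.log(1e-4), np.log(1e1)),
}

def objective(params):
    # Hyperopt minimizes, so return 1 - cross-validated accuracy as the loss.
    model = SVC(C=params["C"], gamma=params["gamma"])
    acc = cross_val_score(model, X, y, cv=5).mean()
    return 1.0 - acc

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print("best hyper-parameters:", best)
```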
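The first contribution, MARS-regression-based tuning, is described only at a high level in the abstract. The sketch below shows one generic way a MARS surrogate could drive the search, assuming the py-earth package's Earth estimator as the MARS implementation and a simple sample-fit-pick loop over a hypothetical random-forest search space; it is not the thesis's exact algorithm.

```python
import numpy as np
from pyearth import Earth                      # MARS implementation (assumed available)
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

def evaluate(cfg):
    # cfg = (n_estimators, max_depth); return cross-validated accuracy.
    model = RandomForestClassifier(n_estimators=int(cfg[0]),
                                   max_depth=int(cfg[1]), random_state=0)
    return cross_val_score(model, X, y, cv=3).mean()

def sample(n):
    # Random configurations: n_estimators in [10, 200], max_depth in [2, 20].
    return np.column_stack([rng.integers(10, 201, n), rng.integers(2, 21, n)])

# Initial random design.
configs = sample(8)
scores = np.array([evaluate(c) for c in configs])

for _ in range(10):
    surrogate = Earth()                        # fit a MARS model to observed results
    surrogate.fit(configs.astype(float), scores)
    candidates = sample(200)
    preds = surrogate.predict(candidates.astype(float))
    best_cand = candidates[int(np.argmax(preds))]   # most promising candidate
    configs = np.vstack([configs, best_cand])
    scores = np.append(scores, evaluate(best_cand))

best = configs[int(np.argmax(scores))]
print("best configuration:", best, "accuracy:", scores.max())
```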
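The second contribution is based on the quantum genetic algorithm, for which the abstract gives no implementation details. The following is only a generic quantum-genetic sketch under common textbook conventions: each gene is a qubit whose angle gives the probability of measuring a 1, binary chromosomes are decoded into a hyper-parameter (here a hypothetical learning-rate grid with a placeholder objective), and a rotation-gate update pulls the population toward the best observed solution. It should not be read as the thesis's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BITS = 6          # bits per chromosome
POP = 10            # population size
GRID = np.linspace(1e-4, 1e-1, 2 ** N_BITS)   # hypothetical learning-rate grid

def fitness(lr):
    # Placeholder objective: peaks at lr = 0.01 (stands in for a model's CV score).
    return -(np.log10(lr) + 2.0) ** 2

def decode(bits):
    # Binary chromosome -> index into the hyper-parameter grid.
    idx = int("".join(map(str, bits)), 2)
    return GRID[idx]

# Each gene is a qubit angle theta; P(measure 1) = sin^2(theta). Start at 45 degrees.
theta = np.full((POP, N_BITS), np.pi / 4)
best_bits, best_fit = None, -np.inf

for gen in range(30):
    # Measurement: collapse every qubit to 0/1 according to its amplitude.
    probs = np.sin(theta) ** 2
    bits = (rng.random((POP, N_BITS)) < probs).astype(int)
    fits = np.array([fitness(decode(b)) for b in bits])

    # Track the best chromosome seen so far.
    if fits.max() > best_fit:
        best_fit = fits.max()
        best_bits = bits[np.argmax(fits)].copy()

    # Rotation-gate update: nudge each qubit toward the best chromosome's bit value.
    delta = 0.05 * np.pi
    direction = np.where(best_bits == 1, +1.0, -1.0)    # rotate toward 1 or toward 0
    theta += delta * direction
    theta = np.clip(theta, 0.01, np.pi / 2 - 0.01)      # keep angles in (0, pi/2)

print("best learning rate:", decode(best_bits), "fitness:", best_fit)
```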
Keywords/Search Tags: hyper-parameter tuning, genetic algorithm, quantum genetic algorithm, MARS, machine learning