In the past decade, with the rapid development of artificial intelligence, machine learning models, including deep neural networks, have been widely used in many industries. In practice, a model's hyper-parameters must be specified before training. To tune them, the best values are often found manually from prior experience, which is not only expensive but also rarely optimal. Bayesian optimization alleviates this problem. This paper focuses on the application of the Bayesian optimization algorithm to the hyper-parameter optimization of machine learning models, where the tuning process is treated as a black-box optimization problem. Current research on Bayesian optimization still has many inadequacies, such as high computational complexity and convergence to sub-optimal solutions. Therefore, how to reduce computational complexity, avoid sub-optimization, and obtain optimal hyper-parameters at lower computational cost has become a focus of research.

This paper studies the application of the Bayesian optimization algorithm in the field of hyper-parameter optimization. The main work is as follows:

Firstly, based on the particle swarm optimization algorithm, the gradient-free BO-PSO algorithm is proposed, which avoids gradient-related matrix calculations on the Gaussian process. On different public data sets, the hyper-parameters of three mainstream models are tuned by the BO-PSO method, and the effectiveness of the algorithm on both classification and regression tasks is verified.

Secondly, making use of the characteristics of the kernel function of the Gaussian process, BO-PSO is improved into the IBO-PSO algorithm, which avoids the repeated evaluations and sub-optimization caused by the approximate rounding of integer hyper-parameters in BO-PSO. Combined with real data sets, the integer hyper-parameters of more complex deep learning models are optimized by the IBO-PSO method, and the effectiveness of the algorithm is verified by example.

Thirdly, by exploiting multi-core processing and segmenting the hyper-parameter search space, the time complexity of the IBO-PSO algorithm is improved, so that better hyper-parameters are found in a shorter time. The parallel IBO-PSO method is then used to optimize the hyper-parameters of a deep learning model; by comparing running times and final optimization results, the efficiency of the algorithm is verified.

Finally, the results show that the above improvements to the Bayesian optimization algorithm reduce computational complexity and mitigate the repeated-evaluation and sub-optimization problems, and the proposed algorithms achieve good results on both public benchmark data sets and real data sets. Hyper-parameters are selected efficiently and accurately, reliance on manual effort is reduced, computing resources and time are saved, and the learning efficiency and automation of the models are improved.
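The abstract does not include implementation details. As a rough illustration of the idea behind BO-PSO, maximizing the acquisition function with particle swarm optimization instead of gradient-based methods, the following minimal Python sketch fits a Gaussian-process surrogate and selects each new point by running PSO over expected improvement. All function names, the toy objective, and the parameter values are illustrative assumptions, not the thesis's actual code.

```python
import numpy as np
from math import erf, sqrt

def rbf_kernel(A, B, length_scale=0.2):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior(X, y, Xq, noise=1e-4):
    # Gaussian-process posterior mean and std at query points Xq.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xq, X)
    mu = Ks @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks.T)
    var = np.clip(1.0 - np.einsum('ij,ji->i', Ks, v), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(Xq, X, y, xi=0.01):
    # EI acquisition for maximizing the objective.
    mu, sd = gp_posterior(X, y, Xq)
    z = (mu - y.max() - xi) / sd
    cdf = np.array([0.5 * (1.0 + erf(t / sqrt(2.0))) for t in z])
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (mu - y.max() - xi) * cdf + sd * pdf

def pso_maximize(f, lo, hi, dim, n_particles=20, iters=40, rng=None):
    # Maximize f with plain particle swarm optimization: no gradients,
    # hence no derivative computations on the GP surrogate are needed.
    rng = rng if rng is not None else np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), f(x)
    g, gval = pbest[pval.argmax()].copy(), pval.max()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = f(x)
        better = fx > pval
        pbest[better], pval[better] = x[better], fx[better]
        if pval.max() > gval:
            g, gval = pbest[pval.argmax()].copy(), pval.max()
    return g

def bo_pso(objective, lo, hi, dim=1, n_init=3, n_iter=12, seed=42):
    # BO loop: fit the GP, then pick the next point by PSO over EI.
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n_init, dim))
    y = np.array([objective(x) for x in X])
    for _ in range(n_iter):
        x_next = pso_maximize(lambda Q: expected_improvement(Q, X, y),
                              lo, hi, dim, rng=rng)
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    return X[y.argmax()], y.max()

# Toy stand-in for a validation score, with its optimum at 0.7.
best_x, best_y = bo_pso(lambda x: -(x[0] - 0.7) ** 2, 0.0, 1.0)
```

In a real tuning run, the toy lambda would be replaced by training the model with hyper-parameters `x` and returning a validation score.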
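One common way to exploit the kernel for integer hyper-parameters, and plausibly the kind of construction IBO-PSO refers to, is to apply the rounding inside the covariance function rather than after the fact. The sketch below (an assumed illustration, not the thesis's exact kernel) shows why this removes the repeated evaluations: two continuous proposals that round to the same integer configuration become indistinguishable to the surrogate.

```python
import numpy as np

def integer_aware_rbf(A, B, integer_dims, length_scale=1.0):
    # Round the integer dimensions *inside* the kernel, so every
    # continuous point that rounds to the same integer configuration
    # gets an identical covariance: the surrogate cannot be tricked
    # into re-proposing points that differ only before rounding.
    A, B = A.astype(float).copy(), B.astype(float).copy()
    A[:, integer_dims] = np.round(A[:, integer_dims])
    B[:, integer_dims] = np.round(B[:, integer_dims])
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

# 3.3 and 2.8 both round to the integer setting 3, so their
# covariance is exactly 1 (identical under the model).
k_same = integer_aware_rbf(np.array([[3.3]]), np.array([[2.8]]), [0])
# 3.6 rounds to 4, so it remains distinguishable from 2.8 (-> 3).
k_diff = integer_aware_rbf(np.array([[3.6]]), np.array([[2.8]]), [0])
```

With a plain RBF kernel, 3.3 and 2.8 would have covariance below 1, so the optimizer could keep proposing fractionally different points that all collapse to the same trained model after rounding.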
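The parallel variant can be pictured as follows: partition the hyper-parameter range into disjoint segments, search each segment concurrently on a separate core, and keep the best result overall. The sketch below is a schematic assumption, not the thesis's implementation; a dense probe of a toy score stands in for a per-segment IBO-PSO run, and a thread pool stands in for the multi-core process pool a CPU-bound search would actually use.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def optimize_segment(bounds):
    # Stand-in for running IBO-PSO on one sub-range of the search
    # space; here a dense probe of a toy validation score whose
    # optimum sits at 0.7.
    lo, hi = bounds
    xs = np.linspace(lo, hi, 201)
    ys = -(xs - 0.7) ** 2
    i = int(ys.argmax())
    return xs[i], ys[i]

def parallel_search(lo, hi, n_segments=4):
    # Split the hyper-parameter range into disjoint segments and
    # search them concurrently; keep the best result found anywhere.
    edges = np.linspace(lo, hi, n_segments + 1)
    segments = list(zip(edges[:-1], edges[1:]))
    with ThreadPoolExecutor(max_workers=n_segments) as pool:
        results = list(pool.map(optimize_segment, segments))
    return max(results, key=lambda r: r[1])

best_x, best_score = parallel_search(0.0, 1.0)
```

Each segment's search is independent, so wall-clock time shrinks roughly with the number of cores while the union of segments still covers the full search space, which is the source of the claimed efficiency gain.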