
Research on Automated Hyperparameter Optimization Algorithms

Posted on: 2020-11-18    Degree: Master    Type: Thesis
Country: China    Candidate: Q Wei    Full Text: PDF
GTID: 2518306500983289    Subject: Computer Science and Technology
Abstract/Summary:
The traditional data analysis and data mining process generally includes the following steps: data preprocessing, feature selection, model selection, hyperparameter optimization, and model evaluation. Reasonable selection of data mining algorithms is the core step of data mining. When building models, different data mining algorithms have different types of hyperparameters, and the same hyperparameter can take different values. The choice of hyperparameters is crucial to the performance of the model, so these hyperparameters must be tuned repeatedly to achieve optimal model performance. However, current hyperparameter optimization relies on the expertise and experience of data engineers and domain experts, and takes a great deal of time and effort to compute and debug. It is therefore especially important to automate hyperparameter optimization for specific algorithms.

This thesis takes the SVM and random forest algorithms in machine learning, and convolutional neural networks in deep learning, as examples, and uses the Bayesian Optimization algorithm to automate hyperparameter optimization. Three problems are addressed by improving the Bayesian Optimization algorithm. First, meta-learning is used to initialize the configuration of Bayesian Optimization, which improves the performance and efficiency of the machine learning algorithm. Second, an improved Relief algorithm is adopted to evaluate the importance of hyperparameters. Third, Bayesian Optimization based on Markov Chain Monte Carlo is used to optimize the hyperparameters of the Bayesian Optimization kernel function, accelerating automated hyperparameter optimization in deep learning. The experiments use datasets from the OpenML platform. The main work is as follows:

(1) A new algorithm, MLI-BO, is proposed, which uses meta-learning to initialize the Bayesian Optimization configuration. The algorithm provides a reasonable initial configuration for Bayesian Optimization, accelerates its start-up and optimization process, and achieves good performance. The results show that, compared with the Bayesian Optimization and Random Search algorithms, MLI-BO provides a better initial hyperparameter configuration and faster optimization for machine learning algorithms.

(2) An improved Relief algorithm is used to evaluate the importance of hyperparameters in machine learning algorithms, in order to understand the importance ordering of different hyperparameters, reduce the number of hyperparameters to optimize, and improve the efficiency of the algorithm. The experimental results show that the improved Relief algorithm provides a hyperparameter importance ordering for the SVM and random forest algorithms.

(3) Bayesian Optimization based on Markov Chain Monte Carlo is used to optimize the hyperparameters of the Bayesian Optimization kernel function, which accelerates the computation of Bayesian Optimization in deep learning. The experimental results show that the error rates of this algorithm on the MNIST and CIFAR datasets are 3.5% and 24.5%, respectively, lower than those of other hyperparameter optimization algorithms.
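The core loop described above, fitting a Gaussian-process surrogate to observed (hyperparameter, validation-error) pairs and choosing the next trial by maximizing expected improvement, can be sketched as follows. This is a minimal illustration, not the thesis's implementation: `validation_error` is a synthetic stand-in for an expensive model evaluation (e.g. SVM cross-validation error over one hyperparameter on a log scale), and the kernel length scale, grid, and iteration budget are all illustrative choices.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def validation_error(x):
    """Synthetic stand-in for an expensive model evaluation (assumption)."""
    return 0.25 + 0.05 * (x - 1.0) ** 2 + 0.02 * math.sin(3.0 * x)

def rbf(a, b, length_scale=1.0):
    # Squared-exponential (RBF) kernel matrix between two 1-D point sets
    d = np.subtract.outer(a, b)
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(xs, ys, grid, noise=1e-6):
    # Standard GP regression equations (zero-mean prior on the centred ys)
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(xs, grid)
    alpha = np.linalg.solve(K, ys)
    mu = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = 1.0 - np.sum(Ks * v, axis=0)  # rbf(x, x) = 1 on the diagonal
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    # EI for minimization: E[max(best - f(x), 0)] under the GP posterior
    sd = np.sqrt(var)
    z = (best - mu) / sd
    cdf = np.array([0.5 * (1 + math.erf(zi / math.sqrt(2))) for zi in z])
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * cdf + sd * pdf

grid = np.linspace(-3.0, 3.0, 121)        # candidate hyperparameter values
xs = [float(x) for x in rng.uniform(-3, 3, 3)]  # random initial design
ys = [validation_error(x) for x in xs]

for _ in range(12):                        # Bayesian Optimization iterations
    y_arr = np.array(ys)
    mu, var = gp_posterior(np.array(xs), y_arr - y_arr.mean(), grid)
    mu = mu + y_arr.mean()
    x_next = float(grid[np.argmax(expected_improvement(mu, var, min(ys)))])
    xs.append(x_next)
    ys.append(validation_error(x_next))

best_x, best_y = min(zip(xs, ys), key=lambda p: p[1])
print(f"best hyperparameter {best_x:.2f}, validation error {best_y:.4f}")
```

The thesis's MLI-BO variant would replace the random initial design with configurations suggested by meta-learning over previously seen datasets, and the MCMC variant would sample the kernel length scale rather than fixing it as above.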
Keywords/Search Tags: automated hyperparameter optimization, machine learning, convolutional neural networks, meta-learning, hyperparameter importance