
Research On Hyper-parameter Optimization And Its Application Based On Intelligent Computation

Posted on: 2019-07-03
Degree: Master
Type: Thesis
Country: China
Candidate: G Lu
Full Text: PDF
GTID: 2428330572951640
Subject: Circuits and Systems
Abstract/Summary:
With the rapid development of the Internet of Things and massive parallel computing, artificial intelligence (AI) has grown from a cutting-edge technology into a mainstream one. Given sufficient computing power, the rise of AI can be attributed to two factors: data and machine learning algorithms. Once the data have been preprocessed, configuring the hyper-parameters of a machine learning algorithm is the key to achieving excellent performance. Hyper-parameters are the parameters that must be set before an algorithm runs; for example, in a genetic algorithm, the population size is a hyper-parameter. Hyper-parameter optimization is the task of choosing a set of good hyper-parameters so that an algorithm performs well. For decades, AI researchers tuned the hyper-parameters of machine learning models by personal experience. In recent years, with the explosive growth of data, machine learning algorithms represented by deep neural networks (DNNs) have shown great advantages in handling massive data, but how to choose their many hyper-parameters remains an open problem. This thesis addresses that problem from the perspective of optimization.

In Chapter 2, we first introduce several classical machine learning models, optimization algorithms, and the modeling procedure of hyper-parameter optimization, together with early hyper-parameter optimization methods such as grid search and random search. Then, to address the instability of random search, we propose the first contribution of this thesis: hyper-parameter optimization based on the quantum-inspired evolutionary algorithm (QEA). Experimental results show that the method achieves excellent results when tuning the hyper-parameters of a neural network on the MNIST dataset.

In Chapter 3, noting that hyper-parameter optimization is essentially an expensive black-box function optimization problem, for which researchers have turned to Bayesian optimization, we introduce Bayesian optimization in detail and propose the second contribution of this thesis: hyper-parameter optimization based on multivariate adaptive regression splines (MARS). Experiments show that the proposed algorithm greatly improves time efficiency compared with Bayesian optimization.

In Chapter 4, we propose the third contribution of this thesis: the application of data-driven optimization to change detection in synthetic aperture radar (SAR) images. Specifically, (1) we propose an easy-to-implement thresholding algorithm for change detection in SAR images based on data-driven optimization, and compare its performance with commonly used methods such as the generalized Kittler and Illingworth thresholding algorithm (GKIT). (2) We then demonstrate how to tune the hyper-parameters of a previously published deep belief network (DBN) for change detection using data-driven optimization. Extensive evaluations on publicly available benchmark datasets suggest comparatively strong performance of the optimized DBN-based change detection algorithm. Throughout this thesis, intelligent computing refers to evolutionary algorithms, Bayesian optimization, optimization based on multivariate adaptive regression splines, and related methods.
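The random-search baseline discussed in Chapter 2 can be sketched as follows. The search space, sampling scheme, and scoring function below are illustrative assumptions, not the thesis's actual experimental setup; the score function stands in for an expensive model-training run.

```python
import math
import random

# Hypothetical search space (illustrative only): learning rate and
# hidden-layer width of a small neural network.
SPACE = {
    "lr": (1e-4, 1e-1),    # sampled log-uniformly
    "hidden": (16, 256),   # sampled uniformly as an integer
}

def sample_config(rng):
    """Draw one random hyper-parameter configuration from SPACE."""
    lo, hi = SPACE["lr"]
    lr = math.exp(rng.uniform(math.log(lo), math.log(hi)))
    hidden = rng.randint(*SPACE["hidden"])
    return {"lr": lr, "hidden": hidden}

def validation_score(cfg):
    """Stand-in for an expensive training-and-validation run.
    Peaks near lr = 0.01 and hidden = 128 (a made-up landscape)."""
    return -((math.log10(cfg["lr"]) + 2) ** 2) - ((cfg["hidden"] - 128) / 64) ** 2

def random_search(n_trials, seed=0):
    """Evaluate n_trials random configurations; keep the best."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = sample_config(rng)
        score = validation_score(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

The instability the thesis targets is visible here: with a small trial budget, the best configuration found varies widely with the random seed.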
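The quantum-inspired evolutionary algorithm of Chapter 2 can be illustrated on the toy OneMax problem. This is a minimal sketch, not the thesis's algorithm: the bit length, population size, rotation-angle update, and the OneMax objective are all assumptions made for the example; encoding hyper-parameters as bit strings would turn the same loop into a hyper-parameter optimizer.

```python
import math
import random

def qea_onemax(n_bits=16, pop=8, gens=40, seed=0):
    """Minimal quantum-inspired EA (QEA) sketch on OneMax: each bit is
    represented by a probability amplitude pair (cos a, sin a); Q-bit
    registers are observed into binary strings, and the angles are
    rotated toward the best string found so far."""
    rng = random.Random(seed)
    theta = 0.05 * math.pi  # fixed rotation-angle step (an assumption)
    # One angle per bit per individual; pi/4 is the equal superposition,
    # i.e. a 50/50 chance of observing 0 or 1.
    q = [[math.pi / 4] * n_bits for _ in range(pop)]

    def observe(angles):
        """Collapse a Q-bit register into a concrete binary string."""
        return [1 if rng.random() < math.sin(a) ** 2 else 0 for a in angles]

    fitness = sum  # OneMax: count of 1 bits
    best = observe(q[0])
    for _ in range(gens):
        for i in range(pop):
            x = observe(q[i])
            if fitness(x) > fitness(best):
                best = x
            # Rotate each Q-bit toward the corresponding bit of `best`,
            # clamping angles to [0, pi/2].
            for j in range(n_bits):
                if best[j] == 1:
                    q[i][j] = min(q[i][j] + theta, math.pi / 2)
                else:
                    q[i][j] = max(q[i][j] - theta, 0.0)
    return best, fitness(best)
```

The superposition encoding keeps each individual stochastic until its angles saturate, which gives QEA broader exploration than a plain bit-string EA of the same population size.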
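The surrogate-based loop that underlies both Bayesian optimization and the MARS variant of Chapter 3 can be sketched in one dimension. Here a low-degree polynomial stands in for the GP or MARS surrogate, the acquisition step is pure exploitation for brevity (a real Bayesian optimizer would add an exploration term such as expected improvement), and the objective is a made-up stand-in for a validation-error curve.

```python
import numpy as np

def objective(x):
    """Stand-in for expensive validation error as a function of one
    hyper-parameter (made-up landscape with a minimum near x = 0.3)."""
    return (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

def surrogate_optimize(n_init=5, n_iter=15, seed=0):
    """Generic surrogate-based loop: fit a cheap model to the points
    evaluated so far, then evaluate the true objective where the
    surrogate predicts the lowest value.  Swapping the polynomial for a
    GP or a MARS model recovers the methods discussed in Chapter 3."""
    rng = np.random.default_rng(seed)
    xs = list(rng.uniform(0.0, 1.0, n_init))    # initial random design
    ys = [float(objective(x)) for x in xs]
    grid = np.linspace(0.0, 1.0, 401)           # candidate points
    for _ in range(n_iter):
        coef = np.polyfit(xs, ys, deg=min(4, len(xs) - 1))
        pred = np.polyval(coef, grid)
        x_next = float(grid[np.argmin(pred)])   # exploit the surrogate
        xs.append(x_next)
        ys.append(float(objective(x_next)))
    best = int(np.argmin(ys))
    return xs[best], ys[best]
```

The time-efficiency argument of Chapter 3 lives in the `polyfit` line: refitting a spline-style surrogate is far cheaper than refitting a Gaussian process as the number of evaluated points grows.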
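The thresholding idea of Chapter 4 can be illustrated on a synthetic pair of images. The log-ratio difference image is standard in SAR change detection; the threshold here is chosen by Otsu's within-class-variance criterion as a simple stand-in for the thesis's data-driven threshold selection, and the bin count is an arbitrary choice for the sketch.

```python
import numpy as np

def log_ratio(img1, img2, eps=1e-6):
    """Log-ratio difference image commonly used for SAR change detection."""
    return np.abs(np.log((img1 + eps) / (img2 + eps)))

def otsu_threshold(values, n_bins=64):
    """Pick the threshold minimizing within-class variance (Otsu's
    criterion), used here in place of the thesis's optimized threshold."""
    hist, edges = np.histogram(values, bins=n_bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_sigma = centers[0], np.inf
    for k in range(1, n_bins):
        w0, w1 = hist[:k].sum(), hist[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (hist[:k] * centers[:k]).sum() / w0
        mu1 = (hist[k:] * centers[k:]).sum() / w1
        var0 = (hist[:k] * (centers[:k] - mu0) ** 2).sum() / w0
        var1 = (hist[k:] * (centers[k:] - mu1) ** 2).sum() / w1
        sigma = w0 * var0 + w1 * var1  # within-class variance
        if sigma < best_sigma:
            best_sigma, best_t = sigma, centers[k]
    return best_t

def change_map(img1, img2):
    """Binary change map: pixels whose log-ratio exceeds the threshold."""
    d = log_ratio(img1, img2)
    return d > otsu_threshold(d.ravel())
```

GKIT-style methods replace the variance criterion with a criterion derived from an assumed statistical model of SAR speckle, which is where data-driven optimization of the threshold becomes attractive.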
Keywords/Search Tags: hyper-parameter optimization, machine learning, optimization, data driven, change detection on SAR images