
Research On Kernel Functions Based On Bayesian Optimization Methods

Posted on: 2020-08-28 | Degree: Master | Type: Thesis
Country: China | Candidate: J K Tian | Full Text: PDF
GTID: 2518306548494454 | Subject: Software engineering
Abstract/Summary:
Bayesian optimization is a derivative-free optimization method with strong theoretical support, and it is widely used in the field of automatic machine learning. Combining Bayesian optimization with an automatic machine learning model yields higher sampling efficiency than other methods. The Gaussian process is the most commonly used probabilistic model in Bayesian optimization, and the kernel function of the Gaussian process strongly affects its performance. However, current kernel functions cannot effectively capture the internal patterns of data and lack the ability to extrapolate. A deeper understanding and better design of kernel functions is therefore an urgent research direction, and one of great significance for information processing in the era of big data.

Given the above problems, this thesis studies and analyzes the kernel function of the Gaussian process and proposes studying the properties of kernel functions in the spectral domain by constructing hierarchical Bayesian models. The main work and innovations of this thesis are as follows:

1. The kernel function of the Gaussian process is studied and analyzed in depth, and the viewpoint of studying kernel properties in the spectral domain of the kernel function is put forward. Theoretical analysis shows that expanding the kernel function in the spectral domain yields kernels with stronger expressive power: such kernels effectively capture the periodicity of the Gaussian process and have the ability to extrapolate. At the same time, the expanded kernel function has a closed-form expression, which facilitates its construction and application (see the first sketch following this abstract).

2. A hierarchical Bayesian model is constructed in which all variables are divided into three categories: hyperparameters, global variables, and local variables. A conjugate prior is set for each parameter, so that the posterior distribution of each parameter can be expressed in closed form. The inference procedure for the hierarchical Bayesian model is derived in full, providing a solid theoretical basis for the subsequent model implementation.

3. An infinite mixture kernel function (IM) based on the Dirichlet process mixture model is proposed. In the implementation, an MCMC method is used to avoid the local minima that derivative-based methods often fall into. When the model is used, the number of Gaussian components is determined automatically from the data provided, which reduces the need for expert knowledge and makes it easier to customize the kernel function for specific scenarios (see the second sketch following this abstract).
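
First sketch. The abstract does not reproduce the closed-form expression, so the following is only a minimal illustration of one standard way to build a stationary kernel from a Gaussian mixture in the spectral (frequency) domain, in the spirit of the spectral expansion described in point 1. The function name, parameterization, and example frequencies are assumptions for illustration, not the thesis's exact construction.

    import numpy as np

    def spectral_mixture_kernel(x1, x2, weights, means, variances):
        # Stationary kernel obtained by placing a mixture of Gaussians on the
        # spectral domain; each component contributes the closed-form term
        # w_q * exp(-2*pi^2 * tau^2 * v_q) * cos(2*pi * tau * mu_q),
        # where tau is the pairwise distance between inputs.
        tau = np.subtract.outer(np.asarray(x1, float), np.asarray(x2, float))
        k = np.zeros_like(tau)
        for w, mu, v in zip(weights, means, variances):
            k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
        return k

    # Example: one component capturing a periodic pattern (frequency ~0.5)
    # and one low-frequency component capturing a slow trend.
    x = np.linspace(0.0, 10.0, 50)
    K = spectral_mixture_kernel(x, x, weights=[1.0, 0.5],
                                means=[0.5, 0.01], variances=[0.05, 0.001])
    print(K.shape)  # (50, 50) covariance matrix for a Gaussian process prior

Because the periodic structure lives in the cosine terms, a Gaussian process with such a kernel can extend learned periodic patterns beyond the training range, which is the extrapolation ability the abstract refers to.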
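Second sketch. As a rough illustration of how a Dirichlet process lets the number of Gaussian spectral components be determined automatically (point 3), the following uses a truncated stick-breaking draw from a Dirichlet process prior. The base measures, truncation level, and weight threshold are assumptions; the thesis's actual procedure infers the components from data via MCMC over the posterior rather than by sampling from the prior.

    import numpy as np

    def sample_dp_spectral_components(alpha, max_components, rng):
        # Truncated stick-breaking draw from a Dirichlet process prior:
        # returns mixture weights plus a spectral mean and variance per component.
        betas = rng.beta(1.0, alpha, size=max_components)
        sticks = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
        weights = betas * sticks
        means = rng.gamma(2.0, 0.5, size=max_components)       # assumed base measure
        variances = rng.gamma(2.0, 0.05, size=max_components)  # assumed base measure
        active = weights > 1e-3  # keep only components with non-negligible weight
        return weights[active], means[active], variances[active]

    rng = np.random.default_rng(0)
    w, mu, v = sample_dp_spectral_components(alpha=1.0, max_components=20, rng=rng)
    print(len(w), "active spectral components")  # count varies with the draw
    # The sampled components can be plugged into spectral_mixture_kernel above
    # to obtain one realization of an infinite-mixture-style kernel.
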
Keywords/Search Tags: Bayesian optimization, kernel function, Gaussian process, Dirichlet process, nonparametric Bayesian model