
Sequential Adaptive Sampling Frame For Global Optimization

Posted on: 2021-05-13
Degree: Master
Type: Thesis
Country: China
Candidate: Y Xiao
Full Text: PDF
GTID: 2427330605957317
Subject: Applied Statistics
Abstract/Summary:
The efficient global optimization (EGO) algorithm is a widely used sequential design of experiments for expensive black-box optimization: it proceeds by repeatedly maximizing the expected improvement (EI) derived from a surrogate model. Within the design of experiments, the "sequential number-theoretic method for optimization" (SNTO), also known as "sequential uniform design", is a zoom-in heuristic for global optimization that iteratively searches for the global optimum with a series of number-theoretic nets or uniform designs.

A well-recognized weakness of the original EGO is that it is serial (one additional point per iteration). A class of methods called parallel EGO, which evaluates multiple points per cycle, has been proposed; however, the computational burden of locating those points becomes another pain point. In this work, a novel way to improve the efficiency of the original EGO, called "Accelerated EGO", is put forward. The algorithm is analogous to parallel EGO in that multiple points with high EI values are added per cycle, but the points are drawn by a refined sampling/importance resampling (SIR) method rather than the time-consuming optimization used by parallel EGO, so the computational burden is greatly eased.

As for SNTO, its obvious drawback is that, without a surrogate model, the information carried by the samples is not deeply mined, and existing runs that fall in the new experimental subspace are discarded. Based on the ideas of SNTO and surrogate modeling, this thesis proposes another global optimization algorithm, named "EI-assisted SNTO". EGO's strategy of maximizing EI is borrowed to locate the zoom-in center, and thanks to the surrogate model built on the experimental space (or subspace), an importance-sampling method is proposed to augment the experimental runs in each cycle, which makes the algorithm suitable for expensive black-box optimization.

The efficiency of the proposed algorithms is validated on several classic test functions of different dimensions. The empirical results show that Accelerated EGO can indeed parallelize the original EGO and achieves a marked improvement over the parallel EGO algorithm "Constant Liar", especially in high-dimensional cases. The simulations also show that EI-assisted SNTO saves a large number of runs compared with the original SNTO, and that it outperforms other EI-based algorithms in low-dimensional cases.

Additionally, all of the above algorithms are applied to hyper-parameter tuning of machine learning models. A support vector machine (SVM) model is used to compare the global optimization algorithms, and the tuning process is further illustrated on an XGBoost model. All results show that the proposed algorithms are suitable for hyper-parameter optimization.
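For illustration, the sketch below shows one cycle of the batch-selection idea summarized above. Under a Gaussian-process surrogate with posterior mean mu(x) and standard deviation sigma(x), the closed-form EI for minimization is EI(x) = (f_min - mu(x)) * Phi(z) + sigma(x) * phi(z), with z = (f_min - mu(x)) / sigma(x), where f_min is the best value observed so far. This is only a minimal sketch, not the thesis's implementation: it assumes a scikit-learn GP surrogate, a uniform proposal pool, raw EI values as resampling weights, and a toy quadratic objective; the thesis's refined SIR step may differ in how proposals and weights are constructed.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X, gp, f_min):
    """Closed-form EI under a GP surrogate (minimization)."""
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-12)          # guard against zero predictive variance
    z = (f_min - mu) / sigma
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def sir_batch(gp, f_min, bounds, q=5, n_pool=2000, rng=None):
    """Draw a batch of q follow-up points by sampling/importance
    resampling: propose a large uniform pool, weight each proposal by
    its EI, and resample without replacement in proportion to the weights.
    (Plain SIR; the thesis uses a refined variant.)"""
    rng = np.random.default_rng(rng)
    d = bounds.shape[0]
    pool = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_pool, d))
    w = np.maximum(expected_improvement(pool, gp, f_min), 0.0)
    if w.sum() == 0.0:                        # flat EI: fall back to uniform resampling
        idx = rng.choice(n_pool, size=q, replace=False)
    else:
        idx = rng.choice(n_pool, size=q, replace=False, p=w / w.sum())
    return pool[idx]

# One cycle on a hypothetical 2-D objective, for illustration only.
f = lambda X: np.sum((X - 0.3) ** 2, axis=1)
bounds = np.array([[0.0, 1.0], [0.0, 1.0]])
rng = np.random.default_rng(0)
X = rng.uniform(size=(10, 2))                 # initial design
y = f(X)
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
X_new = sir_batch(gp, y.min(), bounds, q=5, rng=rng)   # q points per cycle, no inner optimization
X, y = np.vstack([X, X_new]), np.concatenate([y, f(X_new)])

Because the batch comes from resampling a cheap proposal pool rather than from running q auxiliary optimizations of the EI surface, the per-cycle cost stays low, which is the efficiency gain the abstract attributes to Accelerated EGO over parallel EGO.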
Keywords/Search Tags:Global Optimization, Expected Improvement, Sequential Adaptive Design, Importance Sampling, Hyper-parameter Optimization