
Surrogate Models And Optimization In Uncertainty Quantification

Posted on: 2019-03-10
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Yan
Full Text: PDF
GTID: 1360330611992949
Subject: Systems analysis and integration
Abstract/Summary:
With the rapid development of complex equipment systems, the reliability and efficiency of test and evaluation are becoming increasingly important. Evaluating the performance of such systems usually involves a high-dimensional mathematical model that is computationally expensive to run, and the resulting performance indicators carry uncertainties due to limited knowledge. Achieving robust, reliable and efficient performance evaluation and prediction is therefore an urgent task for complex systems. Uncertainty Quantification (UQ) addresses exactly this problem: it provides a complete framework for modelling and optimizing complex systems, and understanding and implementing the corresponding UQ methods is of great significance for improving the quality and standard of test and evaluation. Propagation of Uncertainty (PoU) is a key part of the UQ framework, in which building proper surrogate models and proposing suitable designs of experiments (DoE) are two central tasks. This work develops new surrogate approaches based on the state-of-the-art polynomial chaos expansion (PCE) and Gaussian process (GP) regression models: one method exploits the structural properties of kernels, while the other focuses on the weight functions of independent surrogates. Furthermore, this work introduces K-optimality, in the spirit of Bayesian optimization (BO), for building stable and accurate surrogate models. Finally, this work studies two cases with a real-life background, which are solved by the approaches proposed here.

1. This work solves the problem of automatically truncating the PCE model by fitting a GP model whose Mercer kernel is constructed from orthogonal polynomials, thereby reducing the potential aliasing error efficiently. First, the theoretical differences between the PCE and GP approximations are examined. Mercer's theorem is introduced to study the explicit procedures by which the two methods build isomorphic Reproducing Kernel Hilbert Spaces (RKHS), and the difference between their predictive distributions is quantified with the Kullback-Leibler (KL) divergence. The analysis shows that the GP model based on Mercer kernels, called the GPCB method in this work, can be regarded as an infinite-dimensional PCE with an adaptive truncation scheme, and is therefore able to handle the aliasing error. Explicit expressions of Mercer kernels for commonly used orthogonal polynomials are summarized using the theory of generating functions. To solve the maximum-likelihood problems of the GPCB method, the DIRECT algorithm is modified into a highly efficient global search for the corresponding hyper-parameters. For high-dimensional problems, a random sampling scheme on the tensor product of quadrature points is proposed to generate a computationally feasible and accurate training set for the GPCB method.
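The abstract does not give the GPCB implementation itself, but the core idea of a GP whose kernel is a Mercer expansion in orthogonal polynomials can be sketched as follows. This is a minimal sketch: the probabilists' Hermite basis, the geometric eigenvalue decay rho^n, and the toy data are assumptions made for illustration, and the modified DIRECT hyper-parameter search is omitted.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval  # probabilists' Hermite polynomials He_n

def hermite_features(x, degree):
    """Evaluate the normalized Hermite polynomials He_0..He_degree at the points x."""
    cols = []
    for n in range(degree + 1):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0
        # He_n has squared norm n! under the standard Gaussian measure
        cols.append(hermeval(x, coeffs) / np.sqrt(factorial(n)))
    return np.stack(cols, axis=-1)                      # shape (n_points, degree + 1)

def mercer_kernel(x1, x2, degree=8, rho=0.6):
    """k(x, x') = sum_n rho^n * psi_n(x) * psi_n(x'): a PCE-flavoured Mercer kernel.
    The geometric decay rho^n is an assumed eigenvalue schedule, not the dissertation's."""
    lam = rho ** np.arange(degree + 1)
    return (hermite_features(x1, degree) * lam) @ hermite_features(x2, degree).T

def gp_predict(X_train, y_train, X_test, noise=1e-6, **kw):
    """Standard GP posterior mean and variance using the Mercer kernel above."""
    K = mercer_kernel(X_train, X_train, **kw) + noise * np.eye(len(X_train))
    Ks = mercer_kernel(X_test, X_train, **kw)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    v = np.linalg.solve(L, Ks.T)
    mean = Ks @ alpha
    var = np.diag(mercer_kernel(X_test, X_test, **kw)) - np.sum(v ** 2, axis=0)
    return mean, var

# Toy 1-D example: a smooth function of a standard normal input
rng = np.random.default_rng(0)
X = rng.standard_normal(15)
y = np.sin(X) + 0.5 * X ** 2
mu, var = gp_predict(X, y, np.linspace(-2.0, 2.0, 5))
print(mu, var)
```

Because the kernel is a finite Hermite expansion here, the GP prediction coincides with a regularized PCE of the same degree; letting the expansion run to infinity (when a closed form of the kernel exists) is what gives the adaptive truncation behaviour described above.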
2. This work addresses the inaccuracy caused by relying on a stand-alone surrogate model in two ways: by proposing a robust weighted surrogate model whose weight functions are constructed with the help of the KL divergence, and by performing model selection to discover an additive structure through an MCMC procedure under a GP assumption. First, classical weighted models usually focus on a single statistical feature, such as the moments of the prediction. This work instead defines the variation of the KL divergence as an indicator of the information gain of an individual model within the candidate set; weight functions based on this variation consistently yield more stable results than the classical ones (a minimal sketch of this weighting idea is given after the abstract). Second, model selection is carried out in two layers to handle redundant variables. In the local layer, where the training data are assumed to be generated by an additive structure, an MCMC algorithm samples models from their posterior distribution, and Bayesian averaging approximates the objective function. In the global layer, where the model dimension is uncertain, a proposal distribution specifies the transition probability between different dimensions, so that a simulated annealing algorithm can optimize the conditional entropy and discover the best dimension.

3. This work uses K-optimality, which aims at minimizing the condition number, to solve two key problems of BO methods: unstable GP predictions caused by an ill-conditioned covariance matrix, and the difficulty of choosing the trade-off parameter between exploitation and exploration. First, the Sequential Bayesian K-optimal design (SBKO) is proposed to keep the GP prediction stable, with the condition number used directly as the acquisition function (a simplified sketch also follows the abstract). Theoretical analysis shows that SBKO simultaneously reduces the integrated posterior variance and maximizes the information gain of the hyper-parameters, so it derives a suitable DoE for prediction tasks. Second, a K-optimality enhanced Bayesian Optimization (KO-BO) approach is given for optimization problems, in which K-optimality defines trade-off parameters that can be tuned automatically. Such a procedure keeps the GP prediction robust and helps the search escape local optima.

4. This work demonstrates the performance of the UQ methods in two applications. First, surrogate models are used to accelerate the time-consuming Monte Carlo estimation of the probability of collision between space debris (the sampling pattern is sketched below). The trajectory and velocity of a space object can be reconstructed accurately from a relatively small set of observation data, and the computational efficiency of the surrogate models makes online computation and real-time operation possible. Second, in the accuracy assessment of terminal guidance for active and passive radar systems, equivalent tests are the main source of data. The BO algorithm is used to design an optimal equivalent target whose radar cross section differs minimally from that of the real target; the optimized target leads to more reliable evaluation results for this task.
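The exact weight functional of the second contribution is not spelled out in the abstract, so the following is only a minimal sketch of the idea: each surrogate in an ensemble is weighted by how much the pooled Gaussian prediction shifts, in KL-divergence terms, when that surrogate is removed. The leave-one-out scheme, the moment-matched Gaussian pooling, and the toy numbers are assumptions made for illustration.

```python
import numpy as np

def kl_gauss(m0, v0, m1, v1):
    """KL( N(m0, v0) || N(m1, v1) ) for scalar Gaussians, evaluated elementwise."""
    return 0.5 * (np.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

def kl_variation_weights(means, variances):
    """means, variances: (n_models, n_points) Gaussian predictions of each surrogate.
    Weight each model by the average KL shift of the pooled prediction when it is
    removed -- a leave-one-out proxy for its information gain (assumed form)."""
    n_models = means.shape[0]
    pooled_m = means.mean(axis=0)
    pooled_v = variances.mean(axis=0) + means.var(axis=0)   # moment-matched mixture
    gains = np.empty(n_models)
    for i in range(n_models):
        keep = [j for j in range(n_models) if j != i]
        m_i = means[keep].mean(axis=0)
        v_i = variances[keep].mean(axis=0) + means[keep].var(axis=0)
        gains[i] = kl_gauss(pooled_m, pooled_v, m_i, v_i).mean()
    return gains / gains.sum()

# Toy ensemble of three surrogates predicting at five test points
rng = np.random.default_rng(1)
means = rng.normal(size=(3, 5))
variances = rng.uniform(0.1, 0.5, size=(3, 5))
w = kl_variation_weights(means, variances)
print(w, w @ means)   # weights and the weighted ensemble mean
```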
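For the third contribution, the distinguishing feature is that the condition number of the covariance matrix serves as the acquisition function. A simplified one-dimensional sketch of such a sequential K-optimal design is shown below; the RBF kernel, the fixed length-scale, and the candidate grid are illustrative assumptions, and the posterior-variance and information-gain analysis of the actual SBKO is not reproduced.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.3):
    """Squared-exponential kernel on 1-D inputs (assumed fixed hyper-parameter)."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def sbko_next_point(X_design, candidates, nugget=1e-10, **kw):
    """Pick the candidate minimizing the condition number of the augmented
    covariance matrix -- the condition number acts as the acquisition function."""
    best_x, best_cond = None, np.inf
    for x in candidates:
        X_aug = np.append(X_design, x)
        K = rbf_kernel(X_aug, X_aug, **kw) + nugget * np.eye(len(X_aug))
        cond = np.linalg.cond(K)
        if cond < best_cond:
            best_x, best_cond = x, cond
    return best_x, best_cond

# Grow a 1-D design on [0, 1] one point at a time
design = np.array([0.5])
candidates = np.linspace(0.0, 1.0, 101)
for _ in range(5):
    x_new, cond = sbko_next_point(design, candidates)
    design = np.append(design, x_new)
    print(f"added x = {x_new:.2f}, condition number = {cond:.2e}")
```

The greedy loop naturally spreads points apart, since clustered points make the covariance matrix nearly singular; this is the mechanism by which K-optimality stabilizes the GP prediction.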
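For the first application, the surrogate replaces the expensive orbit propagation inside the Monte Carlo loop. The sketch below only illustrates that pattern: the linear stand-in "surrogate", the input covariance, and the hard-body threshold are placeholder values, not results or models from the dissertation.

```python
import numpy as np

def miss_distance_surrogate(dx):
    """Stand-in surrogate mapping perturbations of the relative state to a miss
    distance in km; in the application a PCE/GP surrogate fitted to orbit
    propagation runs would play this role."""
    return np.abs(1.5 + dx @ np.array([0.8, -0.4, 0.2]))

rng = np.random.default_rng(2)
n_samples = 200_000
dx = rng.multivariate_normal(mean=np.zeros(3), cov=0.25 * np.eye(3), size=n_samples)
threshold_km = 0.5   # illustrative combined hard-body radius
p_collision = np.mean(miss_distance_surrogate(dx) < threshold_km)
print(f"estimated collision probability: {p_collision:.4f}")
```

Because each surrogate evaluation is cheap, the whole Monte Carlo estimate runs in milliseconds, which is what makes the online and real-time use mentioned above feasible.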
Keywords/Search Tags: Bayesian Optimization, Design of Experiments, Gaussian Process, Polynomial Chaos Expansion, Surrogate Model, Uncertainty Quantification