
Trading risk and performance for engineering design optimization using multifidelity analyses

Posted on: 2010-02-03
Degree: Ph.D
Type: Thesis
University: Stanford University
Candidate: Rajnarayan, Dev Gorur
Full Text: PDF
GTID: 2442390002487000
Subject: Statistics
Abstract/Summary:
Computers pervade our lives today: from communication to calculation, their influence percolates many spheres of our existence. With continuing advances in computing, simulations are becoming increasingly complex and accurate. Powerful high-fidelity simulations mimic and predict a variety of real-life scenarios, with applications ranging from entertainment to engineering. The most accurate of such engineering simulations come at a high cost in terms of computing resources and time. Engineers use such simulations to predict the real-world performance of products they design; that is, they use them for analysis. Needless to say, the emphasis is on accuracy of the prediction. For such analysis, one would like to use the most accurate simulation available, and such a simulation is likely to be at the limits of available computing power, quite independently of advances in computing.

In engineering design, however, the goal is somewhat different. Engineering design is generally posed as an optimization problem, where the goal is to tweak a set of available inputs or parameters, called design variables, to create a design that is optimal in some way and meets some preset requirements. In other words, we would like to modify the design variables in order to optimize some figure of merit, called an objective function, subject to a set of constraints, typically formulated as equations or inequalities to be satisfied. Typically, a complex engineering system such as an aircraft is described by thousands of design variables, all of which are optimized during the design process. Nevertheless, do we always need to use the highest-fidelity simulations as the objective function and constraints for engineering design? Or can we afford to use lower-fidelity simulations with appropriate corrections?

In this thesis, we present a new methodology for surrogate-based optimization.
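The constrained design-optimization formulation described above can be sketched with a toy stand-in problem; the objective and constraint here are purely illustrative, not from the thesis, with `scipy.optimize.minimize` playing the role of the optimizer driving a (cheap) analysis code:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for an engineering design problem:
# minimize an objective f(x) over design variables x,
# subject to an inequality constraint g(x) >= 0.
def objective(x):
    # an illustrative figure of merit (e.g. a drag-like quantity)
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

def constraint(x):
    # feasible region x[0] + x[1] <= 1, written as g(x) >= 0
    return 1.0 - (x[0] + x[1])

result = minimize(objective, x0=np.zeros(2),
                  constraints=[{"type": "ineq", "fun": constraint}])
print(result.x)  # optimal design variables
```

In a real application, each call to `objective` would invoke a simulation whose fidelity (and cost) is exactly the trade-off at issue here.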
Existing methods combine the possibly erroneous predictions of the low-fidelity surrogate with estimates of the error in those predictions to synthesize a figure of promise. In contrast, we propose treating those predictions, and the concomitant uncertainties in them, as independent quantities encapsulating the conflicting objectives of seeking designs with good performance and low risk. We then use multiobjective optimization methods to optimize these objectives simultaneously, in order to decide which designs to evaluate next using the high-fidelity analysis. We show that this approach renders the design process robust to modeling errors and parameter choices. In addition, our method generates multiple candidate designs at the end of every search for promising designs. In spite of this, the new search method is no more expensive than existing methods. Moreover, this set of promising designs offers a way to examine various interesting regions of the design space, and this ability suggests a useful visualization and diagnostic tool.

We present numerical experiments that compare our method against existing techniques, both on analytic test problems and on applications in aerodynamics. In all cases, we find that performance is better than that of the best existing methods in terms of both robustness and efficiency.
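The multiobjective selection step can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithm: given surrogate predictions `mu` (to be minimized, for performance) and uncertainties `sigma` (to be maximized, since uncertain regions are worth probing with the expensive analysis), it returns the Pareto-nondominated candidates, yielding the multiple promising designs mentioned above rather than a single point:

```python
import numpy as np

def pareto_candidates(mu, sigma):
    """Indices of candidates not dominated when seeking low predicted
    objective mu AND high prediction uncertainty sigma.  Candidate i is
    dominated if some j is at least as good in both objectives and
    strictly better in at least one."""
    n = len(mu)
    keep = []
    for i in range(n):
        dominated = any(
            mu[j] <= mu[i] and sigma[j] >= sigma[i] and
            (mu[j] < mu[i] or sigma[j] > sigma[i])
            for j in range(n)
        )
        if not dominated:
            keep.append(i)
    return keep

# toy surrogate predictions at five candidate designs (illustrative values)
mu = np.array([1.0, 0.5, 0.8, 1.2, 0.6])
sigma = np.array([0.1, 0.05, 0.3, 0.4, 0.2])
print(pareto_candidates(mu, sigma))  # -> [1, 2, 3, 4]
```

Every index on the returned front is a defensible next high-fidelity evaluation: design 1 has the best predicted performance, design 3 the greatest uncertainty, and designs 2 and 4 intermediate trade-offs.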
Keywords/Search Tags: Engineering design, Performance, Existing methods, Optimization