Random Models in Nonlinear Optimization
Posted on: 2018-02-08
Degree: Ph.D.
Type: Thesis
University: Lehigh University
Candidate: Menickelly, Matt
Subject: Industrial Engineering

Abstract/Summary:

In recent years, there has been a tremendous increase in interest in applying techniques from deterministic optimization to stochastic settings, largely motivated by problems arising in machine learning. A natural question raised by this interest is the extent to which iterative algorithms designed for deterministic (nonlinear, possibly non-convex) optimization must be modified to make proper use of inherently random information about a problem. This thesis is concerned with exactly this question, and adapts the model-based trust-region framework of derivative-free optimization (DFO) to settings in which the objective function values, or the set of points the algorithm selects for evaluation, are random.

In the first part of the thesis, we consider an algorithmic framework called STORM (STochastic Optimization with Random Models), which, as an iterative method, is essentially identical to model-based trust-region methods for smooth DFO. However, by imposing fairly general probabilistic conditions, related to the concept of full linearity, on the objective function models and objective function estimates, we prove that the iterates of algorithms in the STORM framework converge almost surely to first-order stationary points for a broad class of unconstrained stochastic functions. We then show that algorithms in the STORM framework enjoy the canonical rate of convergence for unconstrained non-convex optimization. Throughout the thesis, examples demonstrate how these probabilistic conditions might be satisfied through particular choices of model building and function-value estimation.

In the second part of the thesis, we consider a framework called manifold sampling, intended for unconstrained DFO problems in which the objective is nonsmooth but enough is known a priori about the structure of the nonsmoothness that a queried point can be classified as belonging to a particular smooth manifold of the objective surface. We examine in particular the case of sums of absolute values of (non-convex) black-box functions. Although we assume in this work that the individual black-box functions can be evaluated deterministically, we consider a variant of manifold sampling in which random queries are made at each iteration to enhance the algorithm's "awareness" of the diversity of manifolds in a neighborhood of the current iterate. We then combine the ideas of STORM and manifold sampling to yield a practical algorithm for non-convex ℓ1-regularized empirical risk minimization.
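For context, the "full linearity" referred to above is a standard model-quality notion from model-based DFO; STORM-type analyses require such a condition to hold with at least a fixed probability, conditioned on the algorithm's history. The following is the standard form of the condition, supplied here for illustration rather than quoted from the thesis:

```latex
% Standard fully-linear condition: the model m_k is fully linear on the
% trust region B(x_k, \delta_k) if, for constants
% \kappa_{ef}, \kappa_{eg} > 0 independent of k,
\[
  |f(y) - m_k(y)| \le \kappa_{ef}\,\delta_k^{2}
  \quad\text{and}\quad
  \|\nabla f(y) - \nabla m_k(y)\| \le \kappa_{eg}\,\delta_k
  \qquad \text{for all } y \in B(x_k, \delta_k).
\]
```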
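To make the iteration concrete, below is a minimal sketch of a STORM-style stochastic trust-region loop. All function names, sampling choices, and parameter values (noisy_f, estimate_f, the averaging counts, eta, gamma) are illustrative assumptions, not the thesis's implementation: estimates come from averaged noisy evaluations, and the random model is a crude finite-difference gradient used with a Cauchy-type step.

```python
# A minimal sketch of a STORM-style stochastic trust-region iteration.
# All names and parameter values are illustrative, not the thesis's code.
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, sigma=0.01):
    """Stochastic oracle: a smooth test function plus Gaussian noise."""
    return np.sum(x**2) + 0.1 * np.sum(np.cos(5 * x)) + sigma * rng.normal()

def estimate_f(x, n_samples=30):
    """Estimate f(x) by averaging repeated noisy evaluations."""
    return np.mean([noisy_f(x) for _ in range(n_samples)])

def build_model_gradient(x, delta, n_samples=30):
    """Crude random model: averaged forward differences at radius-scaled
    steps. (One of many ways the probabilistic full-linearity condition
    might be satisfied; the thesis discusses principled choices.)"""
    d = len(x)
    g = np.zeros(d)
    fx = estimate_f(x, n_samples)
    for i in range(d):
        e = np.zeros(d)
        e[i] = delta
        g[i] = (estimate_f(x + e, n_samples) - fx) / delta
    return fx, g

def storm_sketch(x0, delta0=1.0, eta=0.1, gamma=2.0, max_iter=50):
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        fx, g = build_model_gradient(x, delta)
        if np.linalg.norm(g) < 1e-8:
            break
        s = -delta * g / np.linalg.norm(g)       # Cauchy-type step
        pred = -(g @ s)                          # predicted decrease (linear model)
        est_new = estimate_f(x + s)
        rho = (fx - est_new) / max(pred, 1e-16)  # stochastic ratio test
        if rho >= eta:                           # successful: accept and expand
            x, delta = x + s, gamma * delta
        else:                                    # unsuccessful: shrink radius
            delta /= gamma
    return x

print(storm_sketch(np.array([2.0, -1.5])))
```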
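Similarly, for the sum-of-absolute-values objective f(x) = sum_i |F_i(x)| treated in the second part, the smooth manifolds can be indexed by the sign patterns of the component functions. The sketch below illustrates that classification and the resulting manifold gradient under this structure; the tolerance and function names are assumptions, not code from the thesis:

```python
import numpy as np

def manifold_signature(F_vals, tol=1e-10):
    """Classify a point by the sign pattern of the component values
    F_i(x); for f(x) = sum_i |F_i(x)|, each sign pattern indexes a
    smooth manifold, and components with |F_i| <= tol flag proximity
    to a manifold boundary. (Illustrative; tol is an assumption.)"""
    return tuple(0 if abs(v) <= tol else int(np.sign(v)) for v in F_vals)

def manifold_gradient(J, F_vals):
    """On the interior of a manifold, f(x) = sum_i sign(F_i(x)) F_i(x)
    is smooth with gradient J(x)^T sign(F(x)), where J is the Jacobian
    of the black-box components."""
    return J.T @ np.sign(F_vals)

# Example: two components evaluated at some point x.
F_vals = np.array([0.3, -1.2])
J = np.array([[1.0, 0.0], [0.5, -2.0]])   # rows: gradients of F_1, F_2
print(manifold_signature(F_vals))          # -> (1, -1)
print(manifold_gradient(J, F_vals))        # -> [0.5, 2.0]
```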
Keywords/Search Tags: STORM, Random, Manifold sampling, Optimization, Models, Non-convex