
A General Framework of Large-Scale Convex Optimization Using Jensen Surrogates and Acceleration Techniques

Posted on: 2017-06-13
Degree: Ph.D
Type: Dissertation
University: Washington University in St. Louis
Candidate: Degirmenci, Soysal
Full Text: PDF
GTID: 1448390005462817
Subject: Applied Mathematics
Abstract/Summary:
In a world where data rates are growing faster than computing power, algorithmic acceleration grounded in mathematical optimization plays a crucial role in narrowing the gap between the two. As optimization problems in many fields grow in scale, we need faster methods that not only work well in theory but also perform well in practice by exploiting state-of-the-art computing technology.

In this document, we introduce a unified framework for large-scale convex optimization using Jensen surrogates, an iterative optimization technique that has been used in various fields since the 1970s. After this general treatment, we present a non-asymptotic convergence analysis of this family of methods and the motivation for developing accelerated variants. We then discuss widely used acceleration techniques for convex optimization, investigate which of them can be applied within the Jensen surrogate framework, and propose several novel acceleration methods. Finally, we show that the proposed methods perform competitively with, or better than, state-of-the-art algorithms on several applications, including Sparse Linear Regression (Image Deblurring), Positron Emission Tomography, X-Ray Transmission Tomography, Logistic Regression, Sparse Logistic Regression, and Automatic Relevance Determination for X-Ray Transmission Tomography.
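To illustrate the general idea of surrogate-based iterative optimization described above, the sketch below minimizes a sparse linear regression objective by repeatedly minimizing a separable quadratic surrogate that majorizes the smooth data-fit term (the classic ISTA update). This is only one simple member of the surrogate family and is not claimed to be the dissertation's specific Jensen surrogate construction; the function names and the step-size choice are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (closed-form surrogate minimizer)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def surrogate_descent(A, b, lam, n_iters=500):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 by majorize-minimize:
    at each iterate, the smooth term is upper-bounded by a separable
    quadratic surrogate with curvature L, whose minimizer is the
    soft-thresholded gradient step below."""
    # L = largest eigenvalue of A^T A, a Lipschitz constant of the gradient;
    # any L at least this large keeps the surrogate a valid majorizer.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)  # minimize the surrogate
    return x
```

Each iteration drives the true objective downward because the surrogate touches it at the current iterate and lies above it everywhere else; accelerated variants (e.g., momentum-based schemes in the spirit of Nesterov) modify the point at which the surrogate is built to improve the convergence rate.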
Keywords/Search Tags:Optimization, Acceleration, Framework, Jensen