
A Study On High-Dimensional Regression With Quadratic Measurements

Posted on: 2018-02-06    Degree: Doctor    Type: Dissertation
Country: China    Candidate: J Fan    Full Text: PDF
GTID: 1310330518989455    Subject: Operational Research and Cybernetics
Abstract/Summary:
In the era of big data, high-dimensional data arise in fields such as genomics and health science, economics and finance, astronomy and physics, and signal processing and imaging. A common feature of high-dimensional data analysis is the sparsity of the predictors, and one of the main goals is to select the most relevant variables so as to accurately predict a response variable of interest. To this end, many statistical methods have been developed in the context of sparse linear regression. However, in many problems, e.g., compressive sensing, signal processing, and sub-wavelength imaging, the response variables are quadratic functions of the unknown parameters. In this dissertation we introduce the quadratic measurements regression (QMR) model, propose several procedures for variable selection, and establish the corresponding optimization theory and algorithms in the high-dimensional setting.

In Chapter 2, we introduce the concept of uniform regularity, provide sufficient conditions for it, and show that the identifiability of the high-dimensional QMR model can be characterized by uniform regularity.

In Chapter 3, we study the lq (0 < q < 1) regularized least squares method and establish moderate deviation bounds and the so-called weak oracle property. To compute the corresponding estimate, we derive a fixed point equation, construct a fixed point algorithm, and prove its convergence. Numerical experiments demonstrate the finite-sample performance of the proposed method.

In Chapter 4, we employ the l0-constrained least squares method to study the QMR model. We derive a fixed point equation, construct a projected gradient algorithm, and establish its convergence. Numerical experiments demonstrate the finite-sample performance of the proposed method.

In Chapter 5, we consider the weighted l1 penalized quantile regression for the linear model, a special case of QMR, and introduce a fast and novel alternating direction method for computing the corresponding estimate. We establish its convergence and, by combining it with the locally linear approximation technique, propose an algorithm for computing a class of nonconvex penalized quantile regression estimates. Numerical experiments demonstrate its computational efficiency.
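The abstract does not reproduce the model itself. As a minimal sketch, assuming the standard quadratic measurement setup with known symmetric matrices A_i, the QMR model and the lq regularized least squares estimator of Chapter 3 can be written as

    y_i = \beta^\top A_i \beta + \varepsilon_i, \qquad i = 1, \ldots, n,

    \hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p} \frac{1}{2n} \sum_{i=1}^n \bigl( y_i - \beta^\top A_i \beta \bigr)^2 + \lambda \sum_{j=1}^p |\beta_j|^q, \qquad 0 < q < 1,

where \beta \in \mathbb{R}^p is the sparse unknown parameter, \varepsilon_i is noise, and \lambda > 0 is a tuning parameter. The normalization by 1/(2n) and the exact form of the penalty are illustrative assumptions rather than details taken from the dissertation.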
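The projected gradient algorithm of Chapter 4 targets the l0-constrained least squares problem. The Python sketch below shows one plausible form of such an iteration under the quadratic measurement model above: a gradient step on the least squares loss followed by hard thresholding onto the set of s-sparse vectors. The step size, initialization, and iteration count are placeholder assumptions, and the code is an illustration rather than the dissertation's exact algorithm.

    import numpy as np

    def hard_threshold(beta, s):
        """Keep the s largest-magnitude entries of beta and zero out the rest."""
        out = np.zeros_like(beta)
        keep = np.argsort(np.abs(beta))[-s:]
        out[keep] = beta[keep]
        return out

    def projected_gradient_qmr(A, y, s, step=1e-3, n_iter=500, beta0=None):
        """Illustrative projected gradient iteration for
            min_beta 0.5 * sum_i (beta^T A_i beta - y_i)^2  s.t.  ||beta||_0 <= s,
        where A is an (n, p, p) array of known symmetric measurement matrices."""
        n, p, _ = A.shape
        rng = np.random.default_rng(0)
        # A non-zero start is required: the gradient vanishes at beta = 0.
        beta = rng.standard_normal(p) if beta0 is None else beta0.copy()
        for _ in range(n_iter):
            residual = np.einsum('ijk,j,k->i', A, beta, beta) - y    # beta^T A_i beta - y_i
            grad = 2.0 * np.einsum('i,ijk,k->j', residual, A, beta)  # gradient for symmetric A_i
            beta = hard_threshold(beta - step * grad, s)
        return beta

In practice a data-driven (e.g., spectral) initialization and a more careful step size rule would be preferable to the random start and fixed step used in this sketch.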
Keywords/Search Tags: Quadratic Measurements Regression, Sparsity, Statistical Properties, Optimization Algorithms