
Regularization techniques for linear regression with a large set of carriers

Posted on: 1999-01-17
Degree: Ph.D
Type: Thesis
University: University of Washington
Candidate: Sardy, Sylvain
Full Text: PDF
GTID: 2460390014469305
Subject: Statistics
Abstract/Summary:
In the standard linear regression setting, the Least Squares estimate is the best linear unbiased estimate (the Gauss-Markov theorem). However, being best among unbiased estimates does not imply that the Least Squares estimate minimizes the Mean Squared Error, which decomposes into a squared-bias term plus a variance term. It is possible to decrease the variance component at the cost of introducing a bias, leading to an overall reduction of the Mean Squared Error; this trade-off is called regularization. In this thesis, we study two applications of linear regression in which the number of predictor variables is typically as large as, or larger than, the number of observations: the calibration problem in Chemometrics and expansion-based nonparametric function estimation. Both applications call for regularization. We propose three new regularization techniques for the Chemometrics calibration problem: Soft Principal Component Regression, Hard Partial Least Squares, and Soft Partial Least Squares. We then consider nonparametric function estimators based on wavelet expansions. We propose a new Block Coordinate Relaxation method to solve the difficult Basis Pursuit optimization problem. We also investigate robust wavelet-based estimators and solve the corresponding optimization problems with an Interior Point algorithm.
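The bias-variance trade-off invoked above rests on a standard decomposition (stated here for reference, not taken from the thesis itself): for an estimator $\hat\theta$ of a parameter $\theta$,

$$\mathrm{MSE}(\hat\theta) = \mathbb{E}\big[(\hat\theta - \theta)^2\big] = \big(\mathbb{E}[\hat\theta] - \theta\big)^2 + \mathrm{Var}(\hat\theta) = \mathrm{Bias}(\hat\theta)^2 + \mathrm{Var}(\hat\theta).$$

Least Squares sets the bias term to zero; a regularized estimate accepts a small nonzero bias in exchange for a larger drop in variance, lowering the sum.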
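The Basis Pursuit problem mentioned above is a penalized least squares problem, $\min_c \tfrac{1}{2}\|y - \Phi c\|_2^2 + \lambda\|c\|_1$, where $\Phi$ collects the wavelet carriers. Below is a minimal single-coordinate sketch of the coordinate-relaxation idea in Python (NumPy); all names are illustrative. The thesis's Block Coordinate Relaxation method updates entire orthogonal blocks of coefficients at once, so this is a simplified illustration of the relaxation principle, not the author's algorithm.

    import numpy as np

    def soft_threshold(z, t):
        # Soft-thresholding operator: the closed-form minimizer of
        # 0.5*(c - z)^2 + t*|c| with respect to c.
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def coordinate_relaxation(Phi, y, lam, n_sweeps=100):
        # Minimize 0.5*||y - Phi c||^2 + lam*||c||_1 by cycling through
        # the coordinates of c; each one-dimensional subproblem is solved
        # exactly by soft-thresholding.
        n, p = Phi.shape
        c = np.zeros(p)
        resid = y.copy()                   # residual y - Phi c (c = 0 initially)
        col_sq = np.sum(Phi ** 2, axis=0)  # squared column norms of Phi
        for _ in range(n_sweeps):
            for j in range(p):
                if col_sq[j] == 0.0:
                    continue
                resid += Phi[:, j] * c[j]  # remove coordinate j's contribution
                c[j] = soft_threshold(Phi[:, j] @ resid, lam) / col_sq[j]
                resid -= Phi[:, j] * c[j]  # restore residual with updated c[j]
        return c

    # Hypothetical usage: recover a sparse coefficient vector from a random
    # overcomplete dictionary (illustrative data, not from the thesis).
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((64, 128))
    c_true = np.zeros(128)
    c_true[:5] = 1.0
    y = Phi @ c_true + 0.1 * rng.standard_normal(64)
    c_hat = coordinate_relaxation(Phi, y, lam=0.5)

Each coordinate update is exact and decreases the objective, which is what makes relaxation schemes attractive for the non-smooth $\ell_1$ penalty; the block version exploits the fact that over an orthogonal block the update is again a single soft-thresholding step.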
Keywords/Search Tags: Linear regression, Least squares, Regularization, Mean squared error