
Essays on the optimal selection of series functions

Posted on: 2008-07-21
Degree: Ph.D.
Type: Dissertation
University: University of California, San Diego
Candidate: Pascual, Francisco L
Full Text: PDF
GTID: 1440390005954437
Subject: Statistics
Abstract/Summary:
We study the asymptotic optimality of model selection procedures for choosing an optimal model under ridge estimation. We focus on series estimators, linear in the parameters, for approximating the unknown conditional expectation of a target variable given a vector of predictors, which constitutes its optimal forecast under quadratic loss.

How well series estimators perform depends crucially on how the basis, or series, functions are chosen. In addition, ridge estimation improves the predictive ability of the model (and therefore its approximation capability) as long as the shrinkage parameter is chosen optimally.

The first chapter studies the asymptotic optimality (loss-efficiency) of Mallows' CL, leave-1-out cross-validation, generalized cross-validation, and GIC when the model parameters are estimated by ridge regression. We assume stochastic regressors and iid observations.

The second chapter studies asymptotic loss-efficiency in an environment with dependent and heterogeneous observations, which allows us to deal with financial and macroeconomic data. We consider a generalized version of Mallows' CL, which we call GCL, appropriate under this more general data structure. We also cover leave-1-out cross-validation. As we show there, under error correlation the optimality of these criteria breaks down.

The last chapter analyzes a general version of leave-1-out cross-validation, known as h-block cross-validation, which is robust to error correlation.
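The procedures discussed above can be illustrated with a small numerical sketch (not the dissertation's code; all names and the data-generating setup are illustrative). It fits a polynomial series estimator by ridge regression, scores a candidate shrinkage parameter by the leave-1-out shortcut based on the hat matrix, and by h-block cross-validation, which deletes the 2h+1 observations nearest each point and is the correlation-robust variant studied in the last chapter.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Ridge coefficients: (X'X + lam*I)^{-1} X'y."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

def loo_cv(X, y, lam):
    """Leave-1-out CV score via the hat-matrix shortcut e_i / (1 - h_ii),
    valid for linear smoothers such as the ridge series estimator."""
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

def h_block_cv(X, y, lam, h):
    """h-block CV: for each i, refit with the block of observations
    within h positions of i removed (guarding against serial
    correlation), then predict y_i out of sample."""
    n = X.shape[0]
    errs = np.empty(n)
    idx = np.arange(n)
    for i in range(n):
        keep = np.abs(idx - i) > h          # drop the block around i
        b = ridge_fit(X[keep], y[keep], lam)
        errs[i] = y[i] - X[i] @ b
    return np.mean(errs ** 2)

# Usage: select the shrinkage parameter on a grid by each criterion.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, n)
X = np.column_stack([x ** j for j in range(6)])   # polynomial series terms
y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(n)

grid = [0.01, 0.1, 1.0, 10.0]
lam_loo = min(grid, key=lambda lam: loo_cv(X, y, lam))
lam_hb = min(grid, key=lambda lam: h_block_cv(X, y, lam, h=5))
```

With iid data, as generated here, the two criteria tend to agree; h-block CV matters when the errors are serially correlated, where leave-1-out loses its optimality. Setting h = 0 reduces h-block CV to ordinary leave-1-out.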
Keywords/Search Tags:Optimal, Leave-1-out cross-validation, Series, Model