
Ridge-type Estimation Of The Direction In Generalized Single-index Models

Posted on: 2011-07-08  Degree: Master  Type: Thesis
Country: China  Candidate: T Wang  Full Text: PDF
GTID: 2120360305999278  Subject: Probability theory and mathematical statistics
Abstract/Summary:
In this paper we study the large-sample theory of a direction estimator in generalized single-index models with a diverging number of correlated predictors, and we briefly discuss variable selection for the index direction.

The OLS estimator is widely used in linear models. However, when the design matrix is not of full rank or the predictors are nearly collinear, OLS performs poorly; ridge regression was introduced to address this. Through the bias-variance trade-off, it can achieve better estimation on average in terms of mean squared error.

In high-dimensional settings, however, the linear model, or any other parametric model, is at best an approximation to the truth, and checking the adequacy of a model is not easy. To reduce unnecessary modeling bias and to avoid the curse of dimensionality, we consider direction estimation and variable selection in generalized single-index models. Based on the ideas of sufficient dimension reduction and ridge regression, we propose a ridge-type direction estimator that handles both multicollinearity and high dimensionality in the presence of an unknown link function. The asymptotics, including strong consistency and asymptotic normality, are investigated when the number of predictors diverges. We also present a variable selection procedure for generalized single-index models with a diverging number of predictors. Comprehensive simulations and an empirical study on real data are reported to demonstrate the performance of the proposed method.
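For intuition only, the Python sketch below illustrates the general idea of a ridge-type direction estimate in a single-index model: under a linearity condition on the predictors (e.g., elliptically distributed X), a least-squares fit of y on X is proportional to the index direction even though the link is unknown, and a ridge penalty stabilizes the estimate when predictors are nearly collinear. The function ridge_direction, the penalty value lam, and the simulated data are illustrative assumptions; this is not the estimator analyzed in the thesis.

```python
import numpy as np

def ridge_direction(X, y, lam=1.0):
    """Ridge-type proxy for the index direction in a single-index model.

    Illustrative sketch: a ridge-penalized least-squares coefficient,
    normalized to unit length (the direction is identified only up to scale).
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)          # center predictors
    yc = y - y.mean()                # center response
    # ridge solution: (X'X + lam * I)^{-1} X'y
    b = np.linalg.solve(Xc.T @ Xc + lam * np.eye(p), Xc.T @ yc)
    return b / np.linalg.norm(b)

# toy example: y depends on X only through a single index b0'X
rng = np.random.default_rng(0)
n, p = 200, 10
# strongly correlated (nearly collinear) predictors
Sigma = 0.9 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
b0 = np.ones(p) / np.sqrt(p)
y = np.sin(X @ b0) + 0.1 * rng.standard_normal(n)

b_hat = ridge_direction(X, y, lam=5.0)
print("absolute cosine with true direction:", abs(b_hat @ b0))
```

In this toy setting the absolute cosine between the estimated and true directions is close to one, which is the sense in which a ridge-type least-squares fit can recover the index direction despite the unknown link and the near-collinearity.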
Keywords/Search Tags: High dimensionality, Multicollinearity, Ridge regression, Single-index models, Sufficient dimension reduction, Variable selection