
Estimating And Checking The Dimension Reduction Models In Regression

Posted on: 2007-05-21    Degree: Doctor    Type: Dissertation
Country: China    Candidate: L P Zhu    Full Text: PDF
GTID: 1100360185462372    Subject: Probability theory and mathematical statistics
Abstract/Summary:
Dimension reduction is a leitmotif of statistics. The theory of sufficient dimension reduction (SDR) achieves this goal by replacing the original high-dimensional data with a low-dimensional subspace composed of a few linear combinations of the predictors, without losing any information about the conditional distribution of the response given the predictors and without pre-specifying any parametric model. This thesis studies the sufficient dimension reduction problem in the regression setting.

I first study the possibility of reducing the dimensionality sufficiently. To this end, a score-type test, formed as a sum of weighted residuals, is recommended for testing the conditional independence model

$$Y \perp\!\!\!\perp X \mid B^{\top}X,$$

where Y is a scalar response, X is a p × 1 predictor vector, B is a p × K matrix, and $\perp\!\!\!\perp$ denotes statistical independence. The discrepancies are computed from the density functions of Y and of the projections of X along projected directions. It is found that the limiting distribution of the test statistic is model-free. Moreover, the weight function is chosen so that the score-type test is optimal against directional alternatives. From the construction of these tests, a maximin test is defined when there are several candidate alternative models. The test can detect alternatives that are distinct from the null at a parametric rate. An important application of such tests is in determining the structural dimension.

When this conditional independence model holds, a large number of sufficient dimension reduction approaches are available to estimate B. Among them, methods based on the inverse regression of the predictors X on the response Y, rather than the forward regression of Y on X, such as Sliced Inverse Regression (SIR) and Sliced Average Variance Estimation (SAVE), are promising. To estimate the kernel matrix of SIR, Li (1991) proposed slicing estimation, a method with the important merit of easy implementation. Zhu and Ng (1995) established the asymptotic normality of the slicing estimator of the SIR matrix when the number of slices ranges from order $n^{1/2}$ to $n/2$. However, further study in Li and Zhu (2004) demonstrates that the asymptotic results are quite different when slicing estimation is used to estimate SAVE. Therefore, in...
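As a point of reference for the slicing estimation discussed above, the following is a minimal sketch of how the SIR and SAVE kernel matrices are commonly estimated: standardize the predictors, partition the observations into slices by the response, and average slice means (SIR) or within-slice covariances (SAVE). It assumes only numpy; the function names (sir_kernel, save_kernel, dimension_reduction_directions) and the default number of slices are illustrative and not taken from the thesis.

```python
# Sketch of slicing estimation for the SIR and SAVE kernel matrices.
import numpy as np

def _standardize(X):
    """Center and whiten the predictors: Z = Sigma^{-1/2} (X - mean)."""
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return (X - mu) @ Sigma_inv_sqrt, Sigma_inv_sqrt

def _slice_indices(y, n_slices):
    """Partition observations into slices of roughly equal size by y."""
    order = np.argsort(y)
    return np.array_split(order, n_slices)

def sir_kernel(X, y, n_slices=10):
    """Slicing estimate of the SIR matrix M = Cov(E[Z | Y])."""
    n, p = X.shape
    Z, _ = _standardize(X)
    M = np.zeros((p, p))
    for idx in _slice_indices(y, n_slices):
        m_h = Z[idx].mean(axis=0)            # slice mean of Z
        M += (len(idx) / n) * np.outer(m_h, m_h)
    return M

def save_kernel(X, y, n_slices=10):
    """Slicing estimate of the SAVE matrix M = E[(I - Var(Z | Y))^2]."""
    n, p = X.shape
    Z, _ = _standardize(X)
    M = np.zeros((p, p))
    for idx in _slice_indices(y, n_slices):
        V = np.cov(Z[idx], rowvar=False)     # within-slice covariance
        D = np.eye(p) - V
        M += (len(idx) / n) * (D @ D)
    return M

def dimension_reduction_directions(X, y, K, method="sir", n_slices=10):
    """Leading K eigenvectors of the kernel, mapped back to the X scale."""
    kernel = sir_kernel if method == "sir" else save_kernel
    M = kernel(X, y, n_slices)
    _, Sigma_inv_sqrt = _standardize(X)
    vals, vecs = np.linalg.eigh(M)
    # eigh returns ascending eigenvalues; take the top K directions.
    return Sigma_inv_sqrt @ vecs[:, ::-1][:, :K]
```

In practice the structural dimension K would be supplied externally, for example by a test of the conditional independence model such as the one proposed in the thesis; the sketch above takes K as given.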
Keywords/Search Tags: Asymptotic Normality, Dimension Reduction, Empirical Likelihood, Heteroscedasticity, Kernel Estimation, Score Test, Sliced Inverse Regression, Maximin Test