
On Kernel Method For Directional Regression Estimation

Posted on: 2013-02-21
Degree: Master
Type: Thesis
Country: China
Candidate: Q Heng
Full Text: PDF
GTID: 2210330374467342
Subject: Probability theory and mathematical statistics
Abstract/Summary:
In recent years, driven by the advancement of science and technology and by applications in fields such as genomics, signal processing, and the internet, high-dimensional data analysis has developed rapidly. Dimension reduction is a leitmotif of statistics. Without assuming a form for the conditional distribution of the response given the predictors or pre-specifying any parametric model, the theory of sufficient dimension reduction (SDR) achieves this goal by replacing the original high-dimensional predictors with a low-dimensional subspace spanned by a few linear combinations of the predictors, without losing any regression information. Two main issues are involved: one is to estimate the directions of the central (mean) dimension reduction subspace by estimating its basis eigenvectors; the other is to determine the structural dimension. This dissertation studies directional regression (DR), a common method in the sufficient dimension reduction literature, and develops its methodology.

Like contour regression, DR is derived from empirical directions, but it achieves higher accuracy and requires considerably less computation. It combines the advantages of the sliced inverse regression and sliced average variance estimation methods, and it naturally synthesizes dimension reduction estimators based on conditional moments. The asymptotic distribution of the DR estimator can be derived, which leads to a sequential test procedure for determining the dimension of the central dimension reduction subspace. Under mild conditions, DR provides a general and n^(1/2)-consistent estimate of the dimension reduction space.

In this thesis, the asymptotic properties of the kernel smoothing estimation of directional regression are investigated. For the slicing method, the choice of the number of slices remains a problem, whereas the kernel smoothing method improves estimation accuracy and is easier to implement given a proper bandwidth. Moreover, the bandwidth can be selected by cross-validation. Finally, we compare the two methods for estimating the directions of the CDR subspace through comprehensive simulation studies that illustrate the efficiency of our proposal, and we demonstrate its convenience through an application to real data.
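To make the kernel-smoothed route concrete, the following is a minimal Python sketch, not the thesis's actual implementation. It uses the moment decomposition of the DR candidate matrix commonly associated with Li and Wang (2007), F = 2E[E(ZZ'|Y)^2] + 2(E[E(Z|Y)E(Z'|Y)])^2 + 2E[E(Z'|Y)E(Z|Y)]E[E(Z|Y)E(Z'|Y)] - 2I_p, with the slice averages replaced by Nadaraya-Watson kernel estimates and the bandwidth chosen by leave-one-out cross-validation, as described above. All function and variable names (whiten, cv_bandwidth, dr_kernel, etc.) are illustrative and not taken from the thesis.

```python
import numpy as np

def whiten(X):
    """Standardize predictors: Z = Sigma^{-1/2} (X - mean(X))."""
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)
    Si = vecs @ np.diag(vals ** -0.5) @ vecs.T   # Sigma^{-1/2}
    return Xc @ Si, Si

def nw_weights(y, h):
    """Gaussian Nadaraya-Watson weights: W[i, j] proportional to K((y_i - y_j)/h)."""
    K = np.exp(-0.5 * ((y[:, None] - y[None, :]) / h) ** 2)
    return K / K.sum(axis=1, keepdims=True)

def cv_bandwidth(y, Z, grid):
    """Leave-one-out cross-validation for the bandwidth smoothing E(Z | Y = y)."""
    best_h, best_err = grid[0], np.inf
    for h in grid:
        K = np.exp(-0.5 * ((y[:, None] - y[None, :]) / h) ** 2)
        np.fill_diagonal(K, 0.0)                      # leave one out
        W = K / (K.sum(axis=1, keepdims=True) + 1e-12)
        err = np.mean((Z - W @ Z) ** 2)
        if err < best_err:
            best_h, best_err = h, err
    return best_h

def dr_kernel(X, y, d, h=None):
    """Kernel-smoothed directional regression: estimate the conditional
    moments E(Z|Y) and E(ZZ'|Y) by Nadaraya-Watson smoothing, form the DR
    candidate matrix, and take its leading eigenvectors."""
    n, p = X.shape
    Z, Si = whiten(X)
    if h is None:
        h = cv_bandwidth(y, Z, np.std(y) * np.geomspace(0.05, 1.0, 20))
    W = nw_weights(y, h)
    m1 = W @ Z                                        # rows: Ehat(Z | Y = y_i)
    M1 = np.zeros((p, p))
    for i in range(n):
        m2_i = Z.T @ (W[i][:, None] * Z)              # Ehat(ZZ' | Y = y_i)
        M1 += m2_i @ m2_i / n                         # Ehat[E(ZZ'|Y)^2]
    M2 = m1.T @ m1 / n                                # Ehat[E(Z|Y)E(Z'|Y)]
    s = np.trace(M2)                                  # Ehat[E(Z'|Y)E(Z|Y)]
    F = 2 * M1 + 2 * M2 @ M2 + 2 * s * M2 - 2 * np.eye(p)
    vals, vecs = np.linalg.eigh(F)
    B = Si @ vecs[:, ::-1][:, :d]                     # map back to the X scale
    return B / np.linalg.norm(B, axis=0), vals[::-1]
```

A sliced version of the same estimator replaces the Nadaraya-Watson weights with within-slice averages of Z and ZZ', which is how the comparison between the two methods in the simulation studies can be set up; the kernel variant trades the choice of slice number for a bandwidth chosen automatically by cross-validation.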
Keywords/Search Tags: dimension reduction, directional regression, kernel smoothing estimation, slicing, bandwidth, eigenvector