
High-dimensional statistical learning and nonparametric modeling

Posted on: 2011-04-13    Degree: Ph.D.    Type: Thesis
University: Princeton University    Candidate: Feng, Yang    Full Text: PDF
GTID: 2440390002965902    Subject: Biology
Abstract/Summary:
High-dimensional data analysis has become increasingly frequent and important in various fields of science, engineering, and the humanities, ranging from computational biology and health studies to financial engineering and risk management. This thesis addresses two important problems in high-dimensional data: estimation of the network structure (precision matrix) among the variables, and variable selection in ultrahigh-dimensional additive models as well as the Cox proportional hazards model. For network estimation in graphical models, we introduce non-concave penalties such as the SCAD and adaptive LASSO penalties to attenuate the bias problem. Simulation experiments and asymptotic theory are used to justify the proposed methods. In the second part, we extend the correlation learning proposed in Fan and Lv (2008) to marginal nonparametric learning in sparse ultrahigh-dimensional additive models as well as the Cox proportional hazards model. Corresponding iterative versions are also provided to further reduce the false selection rate. For nonparametric additive models, it is shown that, under some mild technical conditions, the proposed independence screening methods enjoy the sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. In addition, the thesis proposes a unified family of parametrically guided nonparametric estimation schemes with theoretical justification and numerical demonstration. This family combines the merits of both parametric and nonparametric approaches and enables prior knowledge to be incorporated.
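As a point of reference, the network (precision matrix) estimation described above is commonly cast as a penalized Gaussian log-likelihood problem; the display below is a minimal sketch of that generic formulation, not necessarily the exact objective used in the thesis, assuming a sample covariance matrix $S$ and a penalty function $p_{\lambda}$ such as the SCAD or an adaptive LASSO weight:

$$\hat{\Omega} = \arg\max_{\Omega \succ 0} \Big\{ \log\det\Omega \;-\; \operatorname{tr}(S\Omega) \;-\; \sum_{i \neq j} p_{\lambda}\big(|\omega_{ij}|\big) \Big\},$$

where the nonzero off-diagonal entries of $\hat{\Omega}$ correspond to the estimated network edges. The marginal nonparametric screening step can likewise be sketched as ranking each covariate $X_j$ by the magnitude of its univariate nonparametric fit, e.g. $\|\hat f_j\|_n^2$ from a spline regression of the response on $X_j$ alone, and retaining only the top-ranked variables; the specific marginal utilities and thresholds in the thesis may differ from this illustration.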
Keywords/Search Tags: Nonparametric, Model