
Neural Networks And Their Applications In Sufficient Dimension Reduction: Theory And Method

Posted on: 2024-06-16
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y F Chen
Full Text: PDF
GTID: 1520307301459074
Subject: Statistics
Abstract/Summary:
The rapid advancement of technology has greatly facilitated data collection across diverse fields, but it has also driven a relentless increase in data dimensionality. Consequently, there is an urgent demand for statistical methods capable of handling large-scale, high-dimensional data. Sufficient dimension reduction, and nonlinear sufficient dimension reduction in particular, offers a powerful avenue for meeting this challenge. The objective of nonlinear sufficient dimension reduction is to identify a minimal set of functions f such that the response Y is conditionally independent of the predictors X given f(X); that is, f(X) retains all of the information in X that is relevant to Y. Traditional approaches typically employ nonparametric regression techniques such as kernel regression, but the rise of machine learning, and of deep learning in particular, has infused new vigor into the field. We therefore integrate deep learning into sufficient dimension reduction. The core contributions of this dissertation are as follows; hedged code sketches illustrating each contribution are given after this summary.

(1) Deep learning theory has predominantly centered on neural networks with Rectified Linear Unit (ReLU) or sigmoid activation functions. We instead develop the theoretical underpinnings of Rectified Power Unit (RePU) neural networks, covering their approximation capabilities and statistical error analysis. We prove that RePU networks represent polynomial functions exactly and enjoy approximation guarantees for certain classes of smooth functions. We then apply these networks to least squares regression and logistic regression and establish error bounds for both settings.

(2) Combining traditional machine learning algorithms with sufficient dimension reduction is a prominent line of research; for instance, support vector machines can be adapted to sufficient dimension reduction with modest adjustments. We generalize the support vector machine loss by replacing the hinge loss with a broader class of convex losses, yielding a generalized support vector machine loss suited to sufficient dimension reduction. We combine this loss with neural networks, establish its validity and asymptotic properties, and confirm its practical performance through numerical simulations.

(3) Classical statistics offers various criteria for quantifying dependence between variables, including distance covariance and martingale difference divergence. Viewing sufficient dimension reduction as a conditional independence problem, we integrate such criteria with dimension reduction. Specifically, we show that the generalized martingale difference divergence is applicable to sufficient dimension reduction, formulate the reduction as a constrained optimization problem, transform it into an unconstrained one, and solve it with neural networks. We establish convergence properties and demonstrate the practical utility of the approach through comprehensive numerical simulations.
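To make these ideas concrete, here is a minimal sketch, assuming PyTorch, of a RePU network applied to least squares regression as in contribution (1). The power p, the network width and depth, and the training settings are illustrative assumptions, not the dissertation's configuration.

```python
import torch
import torch.nn as nn

class RePU(nn.Module):
    """Rectified Power Unit activation: sigma_p(x) = max(0, x) ** p."""
    def __init__(self, p: int = 2):
        super().__init__()
        self.p = p

    def forward(self, x):
        return torch.clamp(x, min=0.0) ** self.p

class RePUNet(nn.Module):
    """Fully connected network with RePU activations."""
    def __init__(self, in_dim, out_dim=1, width=64, depth=3, p=2):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, width), RePU(p)]
            d = width
        layers.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# Least squares regression on synthetic data (illustrative setup).
torch.manual_seed(0)
X = torch.randn(512, 5)
Y = torch.sin(X[:, :1]) + 0.1 * torch.randn(512, 1)
model = RePUNet(in_dim=5)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(X), Y)
    loss.backward()
    optimizer.step()
```

For contribution (2), the sketch below replaces the hinge loss max(0, 1 - v) with a generic convex surrogate phi in an SVM-type empirical risk. The squared hinge and the median split of the response are illustrative choices, and the snippet reuses torch, model, X, and Y from the sketch above.

```python
def generalized_svm_loss(score, ytilde, phi):
    """SVM-type empirical risk with a generic convex surrogate phi
    in place of the hinge loss."""
    return phi(ytilde * score.squeeze(-1)).mean()

# Squared hinge as one possible convex surrogate.
phi = lambda v: torch.clamp(1.0 - v, min=0.0) ** 2
# Labels in {-1, +1} from a median split of the response (illustrative).
ytilde = (Y.squeeze(-1) > Y.median()).float() * 2.0 - 1.0
svm_risk = generalized_svm_loss(model(X), ytilde, phi)
```

For contribution (3), the following sketch encodes the constrained problem, maximizing a sample version of the martingale difference divergence of Y given f(X) subject to whitened features, as an unconstrained penalized objective. The penalty weight lam and the reduction dimension are assumptions; the sample MDD shown is the standard V-statistic form for a scalar response.

```python
def sample_mdd_sq(u, y):
    """Sample squared martingale difference divergence of y given features u;
    it vanishes (in the limit) iff E[y | u] does not depend on u."""
    yc = y - y.mean(dim=0, keepdim=True)     # centered response, shape (n, 1)
    dist = torch.cdist(u, u)                 # pairwise distances ||u_i - u_j||
    return -((yc @ yc.T) * dist).mean()

def sdr_loss(u, y, lam=1.0):
    """Unconstrained surrogate: maximize the MDD subject to Cov(u) = I,
    with the constraint moved into a quadratic penalty."""
    uc = u - u.mean(dim=0, keepdim=True)
    cov = uc.T @ uc / (u.shape[0] - 1)
    penalty = ((cov - torch.eye(u.shape[1])) ** 2).sum()
    return -sample_mdd_sq(u, y) + lam * penalty

f_theta = RePUNet(in_dim=5, out_dim=2)       # f maps X to a 2-dim reduction
loss = sdr_loss(f_theta(X), Y)               # one objective evaluation
```

Gradient steps on sdr_loss with respect to the parameters of f_theta then train the nonlinear reduction, mirroring the training loop in the first sketch.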
Keywords/Search Tags: Sufficient Dimension Reduction, Neural Network, Martingale Difference Divergence, Statistical Machine Learning, Support Vector Machine, Rectified Power Unit