
Infinite Dimensional Statistical Neural Manifold

Posted on: 2007-11-30
Degree: Master
Type: Thesis
Country: China
Candidate: X Z Chen
Full Text: PDF
GTID: 2178360185494415
Subject: Applied Mathematics
Abstract/Summary:
Artificial neural networks (ANNs) are among the most widely used models in artificial intelligence (AI), capable of performing a multitude of tasks. A single ANN, however, may fail to capture the features of a given task accurately and comprehensively, so ANN ensembles have been proposed: architectures that combine a (typically finite) number of individual ANNs to produce the final output.

At the same time, as work on the mathematical foundations of neural network (NN) models and their real-world applications has grown, statistical neural manifolds ([4], [5], [6]) have been proposed to study the overall characteristics of certain classes of neural networks and the properties of information flow between them. Recent research frontiers focus on a unified mathematical description of all statistical NN models and on depicting the collective behavior of large populations of artificial neurons. This thesis contributes to these efforts: it studies optimal network architecture and the consistency ([7]) of certain induction principles, presents and computes the Fisher information matrix on statistical neural manifolds, and attempts a unified mathematical description of all statistical neural network models. Its main original contributions are:

1) A "dynamic component selection mechanism" for networks and ensembles with an overall three-layer topology, together with a necessary and sufficient condition for the existence, uniqueness, and stability of the statistically optimal output-combining weights of an ensemble (a least-squares sketch of such combining weights follows this abstract).

2) A block representation of the Fisher information matrix on the neural manifolds of fully parametrised MLPs, with explicit expressions for its key blocks and their inverses (the underlying Fisher metric is recalled after this list).

3) A consistent natural gradient learning method on the neural manifolds of sigmoidal MLPs ([8]), based on the finiteness of the VC-dimension ([8], [9]) of all sub-exponential sigmoidal networks (a one-step sketch of the natural-gradient update also follows).

4) The "infinite dimensional statistical neural manifold" as a unified description of all finite-dimensional neural manifolds.
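For illustration, here is a minimal least-squares sketch of statistically optimal output combining for an ensemble. This is the classical variance-minimizing combination under a sum-to-one constraint, not the thesis's own derivation; in this setting the optimal weights exist and are unique exactly when the empirical error correlation matrix C is invertible (positive definite), which matches the flavor of the existence/uniqueness condition described in contribution 1.

import numpy as np

def optimal_combining_weights(errors):
    # errors: (n_samples, n_members) member errors on held-out data.
    # Minimizes the variance w^T C w of the combined error subject to
    # sum(w) = 1; the closed form is w = C^{-1} 1 / (1^T C^{-1} 1).
    C = errors.T @ errors / errors.shape[0]  # empirical error correlation matrix
    ones = np.ones(C.shape[0])
    w = np.linalg.solve(C, ones)             # unique iff C is positive definite
    return w / (ones @ w)                    # normalize so the weights sum to 1

The combined prediction is then predictions @ w, where predictions stacks the member outputs column-wise (both array names here are hypothetical).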
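For reference, the Fisher information matrix that turns a parametric family of network distributions p(x, y; θ) into a Riemannian (statistical neural) manifold is, in standard information-geometric notation,

    g_{ij}(\theta) = \mathbb{E}_{p(x,y;\theta)}\left[ \frac{\partial \log p(x,y;\theta)}{\partial \theta_i} \, \frac{\partial \log p(x,y;\theta)}{\partial \theta_j} \right].

Ordering θ layer by layer in an MLP induces the block structure referred to in contribution 2; the thesis's explicit block expressions and inverses are not reproduced here.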
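Below is a minimal sketch of one natural-gradient step in Amari's standard form θ ← θ − η G(θ)^{-1} ∇L(θ), with the Fisher matrix G estimated empirically from per-sample score vectors. The argument names are hypothetical placeholders, and a small damping term is an added assumption to keep the estimated G invertible:

import numpy as np

def natural_gradient_step(theta, scores, loss_grad, lr=0.01, damping=1e-4):
    # scores: (n_samples, n_params) per-sample gradients of log p(x, y; theta),
    #         used to build the empirical Fisher information matrix G.
    # loss_grad: (n_params,) ordinary gradient of the empirical risk.
    G = scores.T @ scores / scores.shape[0]            # empirical Fisher estimate
    G += damping * np.eye(G.shape[0])                  # damping keeps G invertible
    return theta - lr * np.linalg.solve(G, loss_grad)  # theta - lr * G^{-1} grad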
Keywords/Search Tags: Infinite Dimensional Statistical Neural Manifold, Dynamic Component Network Selection Mechanism, Fisher Information Matrix on Statistical Neural Manifolds, Consistent Empirical Risk Minimization Induction Principle, Consistent Natural Gradient