Energy, entropy and information potential for neural computation

Posted on: 1999-07-23
Degree: Ph.D.
Type: Dissertation
University: University of Florida
Candidate: Xu, Dongxin
GTID: 1468390014471603
Subject: Engineering
Abstract/Summary:
The major goal of this research is to develop general nonparametric methods for the estimation of entropy and mutual information, providing a unifying point of view for their use in signal processing and neural computation. In many real-world problems the information is carried solely by data samples, without any other a priori knowledge. The central issue of "learning from examples" is to estimate the energy, entropy, or mutual information of a variable from its samples alone, and to adapt the system parameters by optimizing a criterion based on that estimate.

By using alternative entropy measures such as Rényi's quadratic entropy, coupled with Parzen-window estimation of the probability density function from the data samples, we developed an "information potential" method for entropy estimation (sketched after this abstract). In this method, data samples are treated as physical particles, and the entropy turns out to be related to the potential energy of these "information particles." Maximizing or minimizing the entropy is then equivalent to minimizing or maximizing the "information potential." Based on the Cauchy-Schwarz inequality and the Euclidean distance metric, we further propose the quadratic mutual information as an alternative to Shannon's mutual information. There is also a "cross information potential" implementation of the quadratic mutual information that measures the correlation between the "marginal information potentials" at several levels (also sketched below). "Learning from examples" at the output of a mapper with the "information potential" or the "cross information potential" is implemented by propagating the "information force" or the "cross information force" back to the system parameters. Since the criteria are decoupled from the structure of the learning machine, they are general learning schemes. The "information potential" and the "cross information potential" provide a microscopic expression, at the data-sample level, for the macroscopic measures of entropy and mutual information. The algorithms examine the relative position of every pair of data samples and thus have a computational complexity of O(N²).

An on-line local algorithm for learning is also discussed, in which the energy field is related to the well-known biological Hebbian and anti-Hebbian learning rules (illustrated below). Based on this understanding, an on-line local algorithm for the generalized eigendecomposition is proposed.

The information potential methods have been successfully applied to a variety of problems, such as aspect-angle estimation in synthetic aperture radar (SAR) imagery, target recognition in SAR imagery, layer-by-layer training of multilayer neural networks, and blind source separation. The good performance of the methods on these problems confirms their validity and efficiency.
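
As a rough illustration of the estimator described above, the following Python sketch computes Rényi's quadratic entropy through the information potential, assuming Gaussian Parzen kernels of width sigma. The function names (information_potential, renyi_quadratic_entropy) are illustrative, not taken from the thesis; the pairwise kernel sum makes the O(N²) cost explicit.

    import numpy as np

    def information_potential(samples, sigma):
        # Parzen estimate of the integral of p(x)^2:
        #   V(X) = (1/N^2) * sum_i sum_j G(x_i - x_j; 2*sigma^2),
        # where G is a Gaussian kernel. Convolving two Parzen kernels of
        # variance sigma^2 each gives the 2*sigma^2 variance below.
        x = np.asarray(samples, dtype=float).reshape(len(samples), -1)
        d = x.shape[1]
        sq_dist = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
        var = 2.0 * sigma ** 2
        g = np.exp(-sq_dist / (2.0 * var)) / (2.0 * np.pi * var) ** (d / 2.0)
        return g.mean()  # the O(N^2) pairwise interaction term

    def renyi_quadratic_entropy(samples, sigma):
        # H2(X) = -log V(X): maximizing entropy minimizes the potential.
        return -np.log(information_potential(samples, sigma))

For example, renyi_quadratic_entropy applied to 500 unit-Gaussian samples gives a sample-based estimate of the analytic value log(2*sqrt(pi)) ≈ 1.27; the Parzen estimate approaches it as N grows and sigma shrinks appropriately.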
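
The "information force" is the gradient of the potential with respect to each sample; back-propagating it through the mapper is what adapts the parameters. A minimal sketch under the same Gaussian-kernel assumption (the name information_forces is again illustrative):

    def information_forces(samples, sigma):
        # F_i = dV/dx_i: the net force the other "information particles"
        # exert on particle i. Gradient descent on V pushes samples apart
        # (entropy maximization); ascent pulls them together.
        x = np.asarray(samples, dtype=float).reshape(len(samples), -1)
        n, d = x.shape
        diff = x[:, None, :] - x[None, :, :]            # x_i - x_j
        sq_dist = np.sum(diff ** 2, axis=-1)
        var = 2.0 * sigma ** 2
        g = np.exp(-sq_dist / (2.0 * var)) / (2.0 * np.pi * var) ** (d / 2.0)
        # dV/dx_i = -(2 / (N^2 * var)) * sum_j g_ij * (x_i - x_j)
        return -(2.0 / (n ** 2 * var)) * np.einsum('ij,ijk->ik', g, diff)

In an adaptive system these forces play the role of the injected error in backpropagation: each output sample receives a force, and the chain rule carries it back to the weights, independently of the mapper's structure.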
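
The cross information potential underlying the Cauchy-Schwarz form of the quadratic mutual information can be written entirely in terms of two pairwise kernel (Gram) matrices. The sketch below is one plausible reading of the construction described in the abstract, with illustrative names; I_CS = log(V_J * V_M / V_C^2) is non-negative by the Cauchy-Schwarz inequality and zero when the estimated joint density factorizes.

    def gaussian_gram(samples, sigma):
        # Pairwise kernel matrix K[i, j] = G(s_i - s_j; 2*sigma^2).
        s = np.asarray(samples, dtype=float).reshape(len(samples), -1)
        d = s.shape[1]
        sq = np.sum((s[:, None, :] - s[None, :, :]) ** 2, axis=-1)
        var = 2.0 * sigma ** 2
        return np.exp(-sq / (2.0 * var)) / (2.0 * np.pi * var) ** (d / 2.0)

    def quadratic_mutual_information_cs(x, y, sigma):
        kx = gaussian_gram(x, sigma)
        ky = gaussian_gram(y, sigma)
        v_joint = (kx * ky).mean()          # integral of p(x,y)^2
        v_marginal = kx.mean() * ky.mean()  # integral of (p(x)*p(y))^2
        # Cross potential: correlation of the marginal potentials per sample.
        v_cross = (kx.mean(axis=1) * ky.mean(axis=1)).mean()
        return np.log(v_joint * v_marginal / v_cross ** 2)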
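
The abstract does not spell out the on-line generalized-eigendecomposition update, so the following shows only the well-known building block it connects to: Oja's on-line Hebbian rule, which extracts the principal eigenvector of the input covariance with purely local computation. This is an illustration of the Hebbian/anti-Hebbian connection, not the thesis algorithm.

    def oja_step(w, x, eta=0.01):
        # One on-line update: y = w'x, then w <- w + eta * y * (x - y*w).
        # The Hebbian term y*x grows w along the principal direction; the
        # anti-Hebbian term -y^2*w keeps the norm bounded, so w converges
        # to the top eigenvector of E[x x'].
        y = w @ x
        return w + eta * y * (x - y * w)
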
Keywords/Search Tags: Information, Entropy, Methods, Energy, Estimation, Neural