
Estimation, information and neural signals

Posted on: 1999-08-20
Degree: Ph.D
Type: Thesis
University: Harvard University
Candidate: Twum-Danso, Nanayaa Twumwaa
Full Text: PDF
GTID: 2468390014967864
Subject: Mathematics
Abstract/Summary:
It has been known for about a century and a half that neurons communicate with each other over long distances by transmitting sequences of spikes. A prevalent assumption in the artificial neural network literature over the past fifty years has been that the relevant information in a neural spike train is the spike count per unit interval. Thus, in the majority of artificial neural network models, the neuronal output is modelled as a real-valued signal, where the real number is interpreted as a spike count.

In this thesis, models for neural coding which are a step closer to real neural networks are developed and analysed. A spike train is modelled as a counting process, where the jump times of the counting process correspond to the occurrence times of the spikes. Neural subsystems are modelled as systems which transform stochastic processes, such as diffusion processes, into counting processes. Coding schemes inspired by neurobiology are considered. Specifically, models are proposed for the population coding scheme used in sensory and motor representation, and a stochastic differential equation model which approximates the integrate-and-fire model of a neuron is developed. The latter is called a stochastic integrate-and-fire model.

The population code models are analysed in the context of stochastic filtering theory and information theory. For a special class of these models, a finite-dimensional dynamical filter is developed which computes the minimum mean squared error estimate of the unobserved input process from the observed collection of counting processes. Analytical results are established for the mutual information between the input process and the collection of counting processes. The dependence of the filtering error and the mutual information on the amplitude and width of the tuning curves is investigated extensively. It is shown that, for a fixed tuning-curve amplitude, there is an optimal tuning-curve width that minimizes the filtering error and maximizes the mutual information. This optimal width is shown to increase with the bandwidth of the power spectrum of the input process.

In the analysis of the stochastic integrate-and-fire model, an upper bound is established on the maximum mutual information between the input and output counting processes. For the non-leaky version of the model, it is shown that the maximum mutual information is achieved when the jump times of the input counting process are synchronized with the jump times of a Poisson process of appropriate rate.
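To make the spike-train-as-counting-process idea concrete, the following is a minimal sketch (not the thesis's exact formulation) of a stochastic integrate-and-fire neuron: the membrane potential follows a drift-diffusion simulated with an Euler-Maruyama step, and each threshold crossing adds a jump to the output counting process. All parameter names and values are illustrative assumptions.

```python
import numpy as np

def stochastic_integrate_and_fire(mu=1.2, sigma=0.5, leak=0.0,
                                  threshold=1.0, dt=1e-3, T=2.0, seed=0):
    """Simulate a (possibly leaky) stochastic integrate-and-fire neuron.

    dV = (-leak * V + mu) dt + sigma dW; a spike is emitted and V is reset
    to 0 whenever V crosses `threshold`.  The returned spike times are the
    jump times of the output counting process N(t).
    All parameters are illustrative, not values from the thesis.
    """
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    v = 0.0
    spike_times = []
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        v += (-leak * v + mu) * dt + sigma * dw   # Euler-Maruyama step
        if v >= threshold:                        # threshold crossing -> spike
            spike_times.append((k + 1) * dt)
            v = 0.0                               # reset after the spike
    return np.array(spike_times)

spikes = stochastic_integrate_and_fire()
counts = np.arange(1, len(spikes) + 1)            # N(t) sampled at its jump times
print(f"{len(spikes)} spikes in 2 s; first few jump times: {spikes[:5]}")
```

Setting `leak=0.0` gives the non-leaky version discussed in the abstract; a positive `leak` recovers the leaky variant.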
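The thesis derives a finite-dimensional dynamical filter for a special class of population-code models; that derivation is not reproduced here. As a generic stand-in, the sketch below approximates the same minimum mean squared error estimate with a bootstrap particle filter, assuming an Ornstein-Uhlenbeck input process and conditionally Poisson spike trains whose rates are given by Gaussian tuning curves. Every parameter (tuning-curve centres, amplitude, width, and so on) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions, not values from the thesis).
dt, T = 1e-3, 1.0                    # time step and horizon
theta, sigma_x = 2.0, 1.0            # OU input:  dX = -theta*X dt + sigma_x dW
centers = np.linspace(-2, 2, 9)      # tuning-curve centres of 9 neurons
amp, width = 40.0, 0.7               # tuning-curve amplitude (Hz) and width

def rates(x):
    """Gaussian tuning curves: conditional intensity of each neuron given X = x."""
    return amp * np.exp(-(x - centers) ** 2 / (2 * width ** 2))

# Simulate the hidden input process and the population of spike trains.
n_steps = int(T / dt)
x = np.zeros(n_steps)
spikes = np.zeros((n_steps, centers.size), dtype=int)
for k in range(1, n_steps):
    x[k] = x[k - 1] - theta * x[k - 1] * dt + sigma_x * np.sqrt(dt) * rng.normal()
    spikes[k] = rng.poisson(rates(x[k]) * dt)     # conditionally Poisson counts

# Bootstrap particle filter approximating E[X_t | spikes observed up to t].
n_particles = 2000
particles = np.zeros(n_particles)
x_hat = np.zeros(n_steps)
for k in range(1, n_steps):
    # Propagate particles through the OU dynamics.
    particles += -theta * particles * dt + sigma_x * np.sqrt(dt) * rng.normal(size=n_particles)
    # Point-process likelihood of the observed counts in this bin.
    lam = rates(particles[:, None]) * dt                    # shape (particles, neurons)
    log_w = (spikes[k] * np.log(lam + 1e-12) - lam).sum(axis=1)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    x_hat[k] = np.dot(w, particles)                         # approximate MMSE estimate
    particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample

print("final estimate %.3f vs true input %.3f" % (x_hat[-1], x[-1]))
```

Sweeping `width` while holding `amp` fixed and plotting the mean squared error of `x_hat` gives a rough numerical counterpart to the optimal-tuning-width result stated in the abstract, though only the thesis's analytical treatment establishes it exactly.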
Keywords/Search Tags: Information, Neural, Jump times, Process, Integrate-and-fire model