
A novel probabilistic lower bound on mutual information applied to a category-based framework for spike train discrimination

Posted on: 2010-11-06
Degree: Ph.D
Type: Dissertation
University: University of Florida
Candidate: VanderKraats, Nathan D
Full Text: PDF
GTID: 1448390002477835
Subject: Biology
Abstract/Summary:
We propose a framework by which the information transmission capacity of feedforward networks of spiking neurons can be measured for any discrete set of stimulus classes, or categories, using a statistical classifier to decode stimulus information. Network performance is measured using mutual information (MI), which is estimated both with a naive approach inspired by Fano's inequality and with one that leverages an intermediate, real-valued data representation common to many classification techniques. We establish the latter through a result of independent interest: a novel probabilistic lower bound on the MI between a binary and a continuous random variable. The bound is derived from the Dvoretzky–Kiefer–Wolfowitz (DKW) inequality, which bounds the difference between an empirical cumulative distribution function and the true distribution function.

Our framework is demonstrated on a model of the early human auditory system, using a variety of auditory categories and a phenomenological model of a 400-neuron auditory nerve simulated by the Meddis Inner-Hair Cell model. During classification, in addition to the standard spike-count representation for a population of spike trains, we establish the utility of other novel feature spaces. We find that basic networks with random synaptic weights are surprisingly effective at transmitting stimulus information.

(Full text of this dissertation may be available via the University of Florida Libraries web site. Please check http://www.uflib.ufl.edu/etd.html)
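As a rough illustration of the two standard results the abstract invokes, and not of the dissertation's actual estimators, the Python sketch below shows (a) how a decoder's error rate yields the textbook Fano-style lower bound on the MI between stimulus category and decoded category, and (b) the distribution-free confidence band on an empirical CDF given by the Dvoretzky–Kiefer–Wolfowitz inequality. All function names and parameter values here are illustrative assumptions.

import numpy as np

def fano_mi_lower_bound(error_rate, n_classes, class_priors=None):
    """Fano-style lower bound on MI between stimulus category X and the
    decoded category X_hat:
        I(X; X_hat) >= H(X) - H_b(Pe) - Pe * log2(K - 1),
    where Pe is the decoder's error probability and K the number of classes.
    This is the generic textbook bound, not the dissertation's exact method."""
    if class_priors is None:
        class_priors = np.full(n_classes, 1.0 / n_classes)
    h_x = -np.sum(class_priors * np.log2(class_priors))   # stimulus entropy H(X), in bits
    pe = float(error_rate)
    h_e = 0.0 if pe in (0.0, 1.0) else -(pe * np.log2(pe) + (1 - pe) * np.log2(1 - pe))
    penalty = pe * np.log2(max(n_classes - 1, 1))
    return max(0.0, h_x - h_e - penalty)                   # bits per trial

def dkw_band(samples, confidence=0.95):
    """Dvoretzky-Kiefer-Wolfowitz confidence band for an empirical CDF:
    with probability >= confidence, sup_x |F_n(x) - F(x)| <= eps, where
    eps = sqrt(ln(2 / alpha) / (2 n)) and alpha = 1 - confidence."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = x.size
    f_hat = np.arange(1, n + 1) / n                        # empirical CDF at the sorted samples
    eps = np.sqrt(np.log(2.0 / (1.0 - confidence)) / (2.0 * n))
    return x, np.clip(f_hat - eps, 0.0, 1.0), np.clip(f_hat + eps, 0.0, 1.0)

# Example: a decoder with 20% error over 8 equiprobable stimulus categories
# certifies roughly 1.7 bits of transmitted information per trial.
print(fano_mi_lower_bound(error_rate=0.20, n_classes=8))

The first function turns an observed classification error rate into a conservative MI estimate in bits; the second gives the kind of distribution-free control over an empirical CDF from which the dissertation's bound for a continuous decoder output is said to be derived.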
Keywords/Search Tags: Information, Framework, Novel, Bound, Spike