Dynamical computation with echo state networks

Posted on: 2008-04-06 | Degree: Ph.D | Type: Dissertation
University: University of Florida | Candidate: Ozturk, Mustafa Can | Full Text: PDF
GTID: 1440390005951165 | Subject: Engineering
Abstract/Summary:
Echo state networks (ESNs) were recently proposed as a new recurrent neural network (RNN) paradigm. An ESN couples dynamics with computation in a novel way by conceptually separating the RNN into two parts: a recurrent topology of nonlinear processing elements (PEs) that constitutes a "reservoir of rich dynamics," and an instantaneous linear readout. The interesting property of the ESN is that only the memoryless readout is trained, whereas the recurrent topology has fixed connection weights. This reduces the complexity of RNN training to simple linear regression while preserving a recurrent topology, but it obviously places important constraints on the overall architecture that have not yet been fully studied.

The present design of the fixed ESN parameters relies on selecting the maximum eigenvalue of the system linearized around zero (the spectral radius). However, this procedure does not quantify in a systematic manner the performance of the ESN in terms of approximation error. In this study, we proposed a functional space approximation framework to better understand the operation of ESNs, and an information-theoretic metric (the average entropy of the echo states) to assess the "richness" of the ESN dynamics. We also provided an interpretation of the ESN dynamics, rooted in system theory, as a family of coupled linearized systems whose poles move according to the input signal dynamics. With this interpretation, we put forward a design methodology for function approximation in which ESNs are designed with uniform pole distributions covering the frequency spectrum so as to satisfy the "richness" metric, irrespective of the spectral radius. A single bias parameter at the ESN input, adapted with the modeling error, configures the ESN spectral radius to the joint input-output space.
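The two-part architecture described above (a fixed random reservoir plus a linear readout trained by regression) can be sketched in a few lines of Python. The reservoir size, spectral radius, and delayed-copy task below are illustrative assumptions, not values from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes chosen for illustration.
n_in, n_res, n_out = 1, 100, 1

# Fixed recurrent reservoir: random weights rescaled to a chosen spectral radius.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius = 0.9
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(u):
    """Drive the reservoir with input sequence u (T, n_in); return states (T, n_res)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in @ u_t)            # nonlinear PE update
        states[t] = x
    return states

# Toy task: reproduce the input delayed by 3 steps.
T = 500
u = rng.uniform(-1, 1, (T, n_in))
y = np.roll(u, 3, axis=0)

X = run_reservoir(u)

# Only the memoryless linear readout is trained, here by ridge regression.
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)
y_hat = X @ W_out
```

The reservoir weights are never adapted; all learning happens in the closed-form solve for `W_out`, which is what reduces RNN training to linear regression.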
Function approximation examples compare the proposed design methodology with the conventional design.

Investigating the use of ESNs for dynamical pattern recognition further, we postulated that ESNs are particularly well suited for this task and proposed a linear associative memory (LAM) as a novel readout for ESNs. From the class of LAMs, we adopted the minimum average correlation energy (MACE) filter because of its high rejection characteristics, which allow its use as a detector in the automatic pattern recognition literature. In the ESN application, the MACE filter interprets the states of the ESN as a two-dimensional image: one dimension is time and the other is the processing element index (space). An optimal template image for each class, which associates ESN states with the class label, can be computed analytically from training data. During testing, the ESN states are correlated with each template image, and the class label of the template with the highest correlation is assigned to the input pattern. The ESN-MACE combination leads to a nonlinear template matcher with robust noise performance, as needed in non-Gaussian, nonlinear digital communication channels. We used a real-world chemical sensing experiment with an electronic nose to demonstrate the power of this approach. The proposed readout can also be used with liquid state machines, eliminating the need to convert spike trains into continuous signals by binning or low-pass filtering.

We applied ESNs to interesting real-world problems such as brain-machine interface design, water inflow prediction, detection of action potentials in neural recordings, matched filtering in digital communications, and channel equalization of a nonlinear channel, and compared their performance to other standard techniques. We also proposed ESNs for signal processing in the complex domain.
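The template-correlation readout can be illustrated with a simplified sketch. Note that this is not the actual MACE filter (which is designed in the frequency domain to minimize average correlation energy); it uses a plain per-class mean template instead, and the `make_states` generator below is an entirely hypothetical stand-in for reservoir states driven by class-specific input patterns:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical state "images" of shape (time, processing elements).
T, n_res, n_classes = 50, 20, 3

def make_states(c, noise=0.3):
    """Toy stand-in for ESN states driven by a class-c input pattern."""
    base = np.sin(np.outer(np.arange(T), np.linspace(0.1, 1.0, n_res)) * (c + 1))
    return base + noise * rng.standard_normal((T, n_res))

# Training: one template image per class (here a mean image, not the
# MACE solution the dissertation uses).
templates = [np.mean([make_states(c) for _ in range(20)], axis=0)
             for c in range(n_classes)]

def classify(states):
    """Correlate the state image with each template; return the best-matching class."""
    scores = [np.sum(states * tpl) / (np.linalg.norm(states) * np.linalg.norm(tpl))
              for tpl in templates]
    return int(np.argmax(scores))
```

The key idea carried over from the abstract is the two-dimensional view: time along one axis, processing-element index along the other, with classification by highest correlation against per-class templates.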
The use of ESNs in the complex domain is very convenient, since training the system is equivalent to simple linear regression, which is trivial in the complex domain. The derivatives of the nonlinear activation functions are never needed, since the recurrent part is fixed a priori.

We showed that the Freeman model of the olfactory cortex can be con...
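The complex-domain convenience noted above can be made concrete: once the fixed reservoir states are collected in a matrix, the readout weights follow from complex least squares using the conjugate transpose, with no activation-function derivatives involved. The sizes and synthetic noiseless data below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical complex state matrix X (T samples, N units) and targets d.
T, N = 200, 10
X = rng.standard_normal((T, N)) + 1j * rng.standard_normal((T, N))
w_true = rng.standard_normal(N) + 1j * rng.standard_normal(N)
d = X @ w_true                                   # noiseless targets for the sketch

# Complex least squares: solve (X^H X) w = X^H d with the conjugate transpose X^H.
# No derivatives of the nonlinear activations appear: the recurrent part is fixed.
w = np.linalg.solve(X.conj().T @ X, X.conj().T @ d)
```

Because the data here are noiseless and `X` has full column rank, the solve recovers the generating weights exactly; with noisy targets the same normal equations give the least-squares readout.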
Keywords/Search Tags:ESN, Recurrent, State, Proposed, RNN, Dynamics, Dynamical