
Modularized and Boolean neural networks for image processing

Posted on: 1996-05-13
Degree: Ph.D
Type: Dissertation
University: University of Delaware
Candidate: Chang, Liang-Wen B
GTID: 1468390014988547
Subject: Electrical engineering
Abstract/Summary:
In this dissertation, two types of neural networks are introduced and analyzed: modularized neural networks, which have continuous weights, and MIN/MAX Boolean neural networks, which have discrete (binary) weights. In addition, an optimization method, the genetic annealing algorithm, is devised to improve the supervised learning of neural networks.

A modularized neural network (MNN) is introduced to combat signal distortion and reduce noise. The MNN is compared with a non-modularized neural network when trained to compute functions such as rank ordering and median-type operators. The results show that an MNN containing fixed-weight and trainable modules can be trained efficiently to filter noisy signals. The partitioning of the MNN's input space is determined by an analysis of rank-order information and/or relative-magnitude information in windowed observations. The relation between the rule for partitioning input patterns and the probabilistic characteristics of the signal and noise is discussed. The modularized neural networks are tested on stationary Markov processes with various types of noise, nonstationary signal waveforms with additive noise, and images with additive noise.

A MIN/MAX Boolean neural network is constructed from the minterms and maxterms of Boolean functions. The algebraic structure of the minimum and maximum operators is discussed. Almost all nonlinear, selective filters can be expressed as MIN/MAX Boolean neural networks. A subset of these networks, the constrained Boolean neural networks, reduces their complexity by excluding outliers from the window inputs. The MIN/MAX Boolean neural networks are tested on stationary Markov sequences with various types of noise, nonstationary multi-tone signals with various types of noise, and images with additive noise.
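The abstract does not give the network construction itself, but the classical MIN/MAX (max-of-mins) representation of a rank-order filter illustrates the idea: the median of an odd-length window equals the maximum, over all subsets of size (n+1)/2, of the minimum of each subset. A minimal sketch in Python (the function name is ours, for illustration only):

```python
from itertools import combinations

def minmax_median(window):
    """Median of an odd-length window expressed purely with MIN/MAX:
    the maximum over all k-element subsets of the minimum of each
    subset, where k = (len(window) + 1) // 2. This max-of-mins
    (maxterm-of-minterms) form is the kind of structure a MIN/MAX
    Boolean neural network realizes for selective filters."""
    k = (len(window) + 1) // 2
    return max(min(s) for s in combinations(window, k))

# e.g. minmax_median([3, 1, 2]) returns 2
```

Note that the number of subsets grows combinatorially with the window size, which is why excluding outliers from the window inputs, as the constrained Boolean neural networks do, reduces complexity.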
Both modularized neural networks and MIN/MAX Boolean neural networks are compared structurally with other linear and nonlinear filters.

A new optimization method, the genetic annealing algorithm, which unifies a genetic algorithm with simulated annealing, is developed to search globally for the weights of neural networks. The method arises from the similarity between natural evolution and the physical annealing of materials; the simulated-annealing process serves as the mutation operator within the population. The mathematical formulation of the genetic annealing algorithm is discussed, and the algorithm is evaluated on five standard test-bed functions; simulations show that it gives promising results. In practice, the genetic annealing algorithm is used to optimize the discrete (binary) weights of MIN/MAX Boolean neural networks, and application aspects of the algorithm are also investigated.
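The abstract states only that simulated annealing plays the role of mutation within a genetic population; the exact operators are not given, so the following is a hedged sketch of one plausible genetic-annealing hybrid (function name, parameters, and operators are our assumptions), demonstrated on the sphere function, a common test-bed function:

```python
import math
import random

def genetic_annealing(f, dim, pop_size=20, iters=200,
                      t0=1.0, cooling=0.97, step=0.5, seed=0):
    """Sketch of a genetic/simulated-annealing hybrid (assumed
    operators, not the dissertation's exact formulation): each
    generation applies a simulated-annealing move to every member
    (mutation), then keeps the better half and refills the
    population by uniform crossover of surviving parents."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)]
           for _ in range(pop_size)]
    temp = t0
    for _ in range(iters):
        # SA move as mutation: always accept downhill moves,
        # accept uphill moves with Boltzmann probability exp(-d/T).
        for i, x in enumerate(pop):
            cand = [xi + rng.gauss(0, step) for xi in x]
            d = f(cand) - f(x)
            if d < 0 or rng.random() < math.exp(-d / temp):
                pop[i] = cand
        # Selection: keep the better half, then refill by
        # uniform crossover of two randomly chosen parents.
        pop.sort(key=f)
        parents = pop[:pop_size // 2]
        pop = parents + [
            [rng.choice(pair)
             for pair in zip(rng.choice(parents), rng.choice(parents))]
            for _ in range(pop_size - len(parents))
        ]
        temp *= cooling  # anneal the mutation temperature
    return min(pop, key=f)

# Minimize the sphere function f(v) = sum(x^2) in 3 dimensions.
best = genetic_annealing(lambda v: sum(x * x for x in v), dim=3)
```

For binary weights, as in the MIN/MAX Boolean networks above, the mutation step would flip bits instead of adding Gaussian noise, with the same temperature-controlled acceptance rule.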
Keywords/Search Tags: Neural networks, Modularized, Algorithm, Images with additive noise