
Effects of scaling on Radial Basis Function Neural Networks

Posted on: 2016-06-15
Degree: M.Comp.Sc
Type: Thesis
University: Lamar University - Beaumont
Candidate: Potter, Phillip E., Jr
Full Text: PDF
GTID: 2478390017475950
Subject: Computer Science
Abstract/Summary:
This study attempts to improve the results of a Radial Basis Function Neural Network (RBFNN). The network is based on MATLAB's newrb function (Beale, Hagan and Demuth 2014), reimplemented in Python. Its capability is enhanced by using dynamic neuron spreads derived from nearest-neighbor distances, together with a spread multiplier that scales the dynamic spreads up or down by a common factor; the original newrb function uses only a fixed, constant spread for all neurons. The thesis also investigates an alternative method of calculating nearest-neighbor distances based on the classification of the input at which the neuron is located, and compares these three spread methods. In addition, nine standardization methods for preprocessing the data are evaluated in terms of the neural network's performance. Nine data sets, each consisting of two classes, are selected from the UCI Machine Learning Repository (Lichman 2013). Correlations between the dimension of the data and the number of neurons generated during training are studied, and for two of the data sets the difference between using large and small training sets is examined. The optimal neuron-spread method, spread multiplier, and data standardization are presented for each data set.
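The spread-selection schemes described in the abstract can be sketched in Python as follows. This is a minimal illustration, not the thesis's actual code: the function names, the use of Euclidean distance, and the fallback spread for a class containing a single neuron are all assumptions.

```python
import numpy as np

def dynamic_spreads(centers, multiplier=1.0):
    """Spread for each RBF neuron = multiplier * distance to its
    nearest neighboring center (sketch of the dynamic-spread idea)."""
    # Pairwise Euclidean distances between all neuron centers
    diffs = centers[:, None, :] - centers[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)  # ignore each center's self-distance
    return multiplier * dists.min(axis=1)

def class_conditional_spreads(centers, labels, multiplier=1.0):
    """Variant: nearest-neighbor distance restricted to centers sharing
    the same class label (an assumed reading of the alternative method).
    Falls back to a spread of 1.0 for a class with a single neuron."""
    spreads = np.empty(len(centers))
    for i, c in enumerate(centers):
        same_class = centers[labels == labels[i]]
        d = np.sqrt(((same_class - c) ** 2).sum(axis=1))
        d = d[d > 0]  # drop the zero self-distance
        spreads[i] = multiplier * (d.min() if d.size else 1.0)
    return spreads

def rbf_activations(X, centers, spreads):
    """Gaussian RBF activations of inputs X given centers and spreads."""
    diffs = X[:, None, :] - centers[None, :, :]
    sq_dists = (diffs ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * spreads[None, :] ** 2))
```

With a fixed spread (as in the original newrb), `spreads` would simply be a constant vector; the multiplier then lets all dynamic spreads widen or narrow together, trading smoother interpolation against sharper local response.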
Keywords/Search Tags:Neural network, Function, Data sets, Spread