
Implementation of a New Sigmoid Function in Backpropagation Neural Networks

Posted on: 2012-12-18
Degree: M.S.
Type: Thesis
University: East Tennessee State University
Candidate: Bonnell, Jeff
Full Text: PDF
GTID: 2458390011455478
Subject: Applied Mathematics
Abstract/Summary:
This thesis presents the use of a new sigmoid activation function in backpropagation artificial neural networks (ANNs). ANNs using conventional activation functions may generalize poorly when trained on a data set that includes quirky, mislabeled, unbalanced, or otherwise complicated data. The new activation function is an attempt to improve generalization and reduce overtraining on mislabeled or irrelevant data by restricting training when inputs to the hidden neurons are sufficiently small. This activation function includes a flattened, low-training region that grows or shrinks during backpropagation to ensure a desired proportion of inputs falls inside the low-training region. With a desired low-training proportion of 0, the activation function reduces to a standard sigmoidal curve. A network with the new activation function implemented in the hidden layer is trained on benchmark data sets and compared with a network using the standard activation function, in an attempt to improve the area under the receiver operating characteristic (ROC) curve in biological and other classification tasks.
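The abstract does not give the closed form of the new activation function, so the following is only a hypothetical sketch of the idea it describes: a logistic curve with a flat region of width `2c` around zero (where training is suppressed because the derivative vanishes), continuous at the boundaries, reducing to the standard sigmoid when `c = 0`. The name `flat_sigmoid`, the piecewise construction, and the quantile-based choice of `c` are all assumptions, not the thesis's actual definition.

```python
import numpy as np

def flat_sigmoid(x, c=1.0):
    # Hypothetical flattened sigmoid (assumed form, not the thesis's):
    # constant 0.5 on the low-training region [-c, c], and a shifted
    # logistic curve outside it. The shifts by +/-c keep the function
    # continuous at the region boundaries; with c = 0 this is exactly
    # the standard sigmoid 1 / (1 + exp(-x)).
    x = np.asarray(x, dtype=float)
    out = np.full_like(x, 0.5)
    hi = x > c
    lo = x < -c
    out[hi] = 1.0 / (1.0 + np.exp(-(x[hi] - c)))
    out[lo] = 1.0 / (1.0 + np.exp(-(x[lo] + c)))
    return out

def fit_region_halfwidth(hidden_inputs, p):
    # One plausible way to realize "grows or shrinks during
    # backpropagation": pick c as the p-quantile of the absolute
    # hidden-neuron inputs, so a proportion p of inputs falls
    # inside the flat region [-c, c]. With p = 0, c = 0 and the
    # standard sigmoid is recovered.
    return np.quantile(np.abs(np.asarray(hidden_inputs, dtype=float)), p)
```

In a backpropagation implementation, the zero derivative inside `[-c, c]` is what suppresses weight updates for small hidden-neuron inputs; `fit_region_halfwidth` would be re-run periodically during training to keep the flat region covering the desired proportion of inputs.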
Keywords/Search Tags: Function, New, Backpropagation