
Neural networks and handwritten signature verification

Posted on: 1992-09-07
Degree: Ph.D
Type: Thesis
University: Stanford University
Candidate: Pender, Dorothy A
Full Text: PDF
GTID: 2478390014999222
Subject: Engineering
Abstract/Summary:
In this thesis, various neural networks are trained to detect true signatures of several subjects and to reject forgeries of these same signatures. The forgeries presented to the network are unpracticed, nonprofessional, casual forgeries. Casual forgeries are by far the most prevalent type of forgery, resulting in large monetary losses nationwide.

Only a small number of training signatures is required to teach a neural network to perform the signature classification. Forgeries are required in the training set, but it is shown that true signatures of other subjects suffice as forgeries; in this way, actual forgeries need not be collected to train a network. The trained neural networks are tested on new signatures, yielding an error rate of 3% rejection of true signatures as forgeries and 3% acceptance of forgeries as true signatures.

The training sets for the signature verification networks are small, finite sets. The dynamic behavior of a neural network trained on an infinite input set is well known, but it is relatively unknown for finite training sets. New theoretical results are presented here describing the behavior of a neural network trained on a finite-size training set. Under special conditions, the optimal weights of a neural network are equivalent to the optimal weights of the synthetic discriminant function, another signal-processing tool for performing classification.

The backpropagation learning algorithm for neural networks is examined in greater detail to reveal limits on the learning parameters. The mean-square-error performance surface is examined for a single nonlinear neuron and shown to be similar to the performance surface for a single linear neuron, although there are choices of inputs and outputs that give rise to a performance surface very different from that of a single linear neuron.
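The central training idea, using other subjects' true signatures as the "forgery" class, can be illustrated with a minimal sketch. The thesis's actual signature features and network architecture are not specified here, so synthetic random feature vectors, a single sigmoid neuron, and the learning rate are all placeholder assumptions:

```python
import numpy as np

# Illustrative sketch only: random vectors stand in for signature features.
# The key idea shown is that other subjects' TRUE signatures serve as the
# negative ("forgery") class, so no actual forgeries need be collected.
rng = np.random.default_rng(1)

d = 16                                              # hypothetical feature dimension
subject_a = rng.normal(0.0, 0.3, (10, d)) + 1.0     # subject A's true signatures
others = rng.normal(0.0, 0.3, (30, d)) - 1.0        # other subjects' true signatures

X = np.vstack([subject_a, others])
y = np.concatenate([np.ones(10), np.zeros(30)])     # 1 = accept, 0 = reject

# A single sigmoid neuron trained by gradient descent on mean-square error.
w = np.zeros(d)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))          # forward pass
    g = (p - y) * p * (1 - p)                       # MSE gradient through sigmoid
    w -= lr * X.T @ g / len(y)
    b -= lr * g.mean()

accept = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
print((accept == (y == 1)).mean())                  # fraction classified correctly
```

On this synthetic data the two classes are well separated, so the neuron learns the decision easily; real signature features would of course overlap far more.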
Lastly, an alternative to the backpropagation learning algorithm is presented that utilizes a step output function for binary classification. This modified algorithm is applied to the exclusive-or problem, which tests the network's ability to nonlinearly partition the input space. The modified algorithm learns the exclusive-or classification much faster and with fewer difficulties than the conventional backpropagation algorithm.
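For context, the exclusive-or benchmark with conventional backpropagation can be sketched as follows; this is the baseline the modified step-output algorithm is compared against, not a reproduction of it. The layer sizes, learning rate, and iteration count are assumptions, not values from the thesis:

```python
import numpy as np

# Conventional backpropagation on the exclusive-or problem: a 2-4-1
# sigmoid network trained by batch gradient descent on mean-square error.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical initialization and learning rate (not from the thesis).
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros((1, 1))
lr = 1.0

for _ in range(20000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of mean-square error through sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round().ravel())
```

A single linear neuron cannot solve this task, which is why XOR is the standard test of nonlinear partitioning; backpropagation with sigmoid units solves it but can converge slowly or stall, the difficulty the thesis's step-output variant addresses.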
Keywords/Search Tags: Neural network, Forgeries, True signatures, Algorithm, Classification, Trained