
Research On Optimizing The Neural Network With Random Weights By Backtracking Search Algorithm And Its Application

Posted on: 2017-01-12
Degree: Master
Type: Thesis
Country: China
Candidate: B Q Wang
Full Text: PDF
GTID: 2308330485982202
Subject: Software engineering
Abstract/Summary:
Error back propagation (BP) has long been the first choice for optimizing the parameters of a neural network, but it converges slowly and easily falls into local minima, which seriously degrades model performance. A neural network with random weights (NNRW) has a single hidden layer: the hidden-layer parameters (from the input-layer nodes to the hidden-layer nodes) are generated randomly, and the output-layer parameters (from the hidden-layer nodes to the output-layer nodes) are computed analytically. Compared with a BP-trained network, an NNRW shortens training time by a factor of hundreds while also improving accuracy and generalization ability.

For an NNRW, the randomly generated hidden-layer parameters improve performance, but they also force the model to use many more hidden nodes, making it too large and therefore inefficient. This problem has attracted many researchers, and optimizing the random parameters with an evolutionary algorithm has been proposed as one solution. Evolutionary computation is a class of heuristic search algorithms based on natural selection and the genetic mechanisms of biological evolution; its four main branches are genetic algorithms, genetic programming, evolution strategies, and evolutionary programming. In this thesis, we use the backtracking search algorithm (BSA), an evolutionary algorithm, to optimize the random parameters of the network and thereby improve model performance. However, BSA is a greedy search process, so the model easily overfits the validation set and may then perform poorly on the test set. We therefore propose a loss function with two constraints to handle this problem; the data constraint in this double-constraint loss function alleviates the overfitting.
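The NNRW training scheme described above, with random hidden-layer parameters and a closed-form least-squares solve for the output layer, can be sketched as follows. This is a minimal illustration rather than the thesis's implementation; the hidden-layer width, the tanh activation, and the weight range are assumptions.

```python
import numpy as np

def train_nnrw(X, y, n_hidden=50, seed=0):
    """Train a single-hidden-layer NNRW.

    The hidden-layer weights W and biases b are drawn at random and
    never updated; only the output weights beta are computed, in closed
    form, as the least-squares solution via the pseudoinverse.
    """
    rng = np.random.default_rng(seed)
    W = rng.uniform(-2.0, 2.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-2.0, 2.0, size=n_hidden)
    H = np.tanh(X @ W + b)          # random hidden-layer features
    beta = np.linalg.pinv(H) @ y    # output layer solved, not iterated
    return W, b, beta

def predict_nnrw(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because no gradient iterations are involved, training reduces to one matrix product and one pseudoinverse, which is the source of the speed-up over BP mentioned above.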
Generalization ability is an important criterion for evaluating a model, and in this thesis we also propose a new criterion for measuring it.

Early signs of many diseases, such as diabetes and glaucoma, appear on the retina, so these diseases can be prevented and treated by analyzing retinal images. Segmenting the retinal blood vessels is the foundation of such analysis: properties of the segmented vessels, such as their tortuosity and branching, play an important role in the accuracy of the analysis. In this thesis, applying the NNRW improved by the backtracking search algorithm to retinal blood vessel segmentation achieves good performance.

In the experiments, the improved NNRW performs well. Although this thesis devotes considerable space to the BSA-improved NNRW and its application, many problems remain to be studied.
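To illustrate how the backtracking search algorithm explores a parameter space, here is a simplified sketch of BSA minimizing a generic loss function. The population size, mutation scale, and crossover rule follow the standard algorithm in spirit, but this is not the thesis's implementation, and the double-constraint loss is not reproduced here.

```python
import numpy as np

def bsa_minimize(loss, dim, pop_size=20, iters=150, low=-1.0, high=1.0, seed=0):
    """Minimize loss(x) over [low, high]^dim with a simplified BSA."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(low, high, (pop_size, dim))      # current population
    old_P = rng.uniform(low, high, (pop_size, dim))  # historical population
    fit = np.array([loss(p) for p in P])
    for _ in range(iters):
        # Selection-I: occasionally redefine the historical population,
        # giving BSA a memory of earlier generations.
        if rng.random() < rng.random():
            old_P = P.copy()
        rng.shuffle(old_P)
        # Mutation: move toward (or past) the shuffled historical population.
        F = 3.0 * rng.standard_normal()
        mutant = P + F * (old_P - P)
        # Crossover: each trial keeps a random subset of dimensions from P.
        mask = rng.random((pop_size, dim)) < rng.random()
        trial = np.clip(np.where(mask, mutant, P), low, high)
        # Selection-II: greedy replacement of worse individuals.
        trial_fit = np.array([loss(t) for t in trial])
        better = trial_fit < fit
        P[better] = trial[better]
        fit[better] = trial_fit[better]
    best = int(np.argmin(fit))
    return P[best], fit[best]
```

In the thesis's setting, each individual would encode the NNRW's random hidden-layer parameters and the loss would be the double-constraint objective evaluated on validation data; the greedy Selection-II step is exactly the behavior the data constraint is meant to keep in check.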
Keywords/Search Tags: neural network with random weights, double-constraint loss function, retinal blood vessel segmentation, evolutionary algorithm, backtracking search algorithm