
Study on Optimization Strategy of Feedforward Neural Network's Parameters and Structure

Posted on: 2005-09-25
Degree: Master
Type: Thesis
Country: China
Candidate: J Lv
Full Text: PDF
GTID: 2168360125964575
Subject: Computer application technology

Abstract/Summary:
Solving application problems with neural networks means making the optimal solutions of the real problem correspond to the stable states of the network, and mapping the optimization procedure onto the evolution of the network system, exploiting the parallel computing ability of the neurons. As problems grow in dimension and complexity, however, a single optimization algorithm yields imperfect results and offers only limited room for improvement. For this reason, hybridizing algorithms has become an important and effective way to improve optimization performance: one algorithm borrows the strengths of another to offset its own weaknesses, which greatly improves the network's problem-solving ability and efficiency.

This thesis studies the error back-propagation (BP) algorithm, the simulated annealing (SA) algorithm, and the genetic algorithm (GA) for neural networks, and analyzes and compares their characteristics and optimization performance. A new adaptive hierarchical genetic algorithm (HGA) is also investigated; it offers a new way to solve a problem earlier methods could not handle, namely optimizing the topological structure of the network at the same time as its parameters. The HGA simulates biological structure and natural evolutionary mechanisms, giving it a strong capacity for structure representation and global optimization. It adopts a hybrid coding scheme that mixes binary coding with floating-point coding, and it learns efficiently because it optimizes the weights (including node thresholds) and the topological structure of the network simultaneously. Adaptive crossover and mutation probabilities accelerate evolution and avoid premature convergence. Numerical simulations demonstrate that the HGA has better optimization performance.

Moreover, two hybrid optimization strategies are put forward: BP+SA and HGA+BP. In BP+SA, the BP algorithm is the main framework and SA is introduced into the learning process, combining BP's gradient-guided search with SA's probabilistic jumps: gradient descent improves the local search capability, while SA's probabilistic jumps secure the final global convergence. In HGA+BP, a BP operation is introduced into the HGA, so the GA's global optimization searches the possible extremum regions over a large scope while BP searches quickly near the extremum points; the hierarchical coding meanwhile allows the network structure to be optimized. Global optimization and fast local search are thus combined, and HGA+BP reaches a satisfactory compromise among the approximation precision, generalization ability, and model complexity of the network. Both hybrid strategies are applied to chaotic time series prediction, and the results demonstrate that they markedly improve the learning performance and generalization ability of feedforward neural networks. Minimal sketches of the two strategies follow below.
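The BP+SA idea can be illustrated with a short Python sketch: plain gradient descent (BP) is the main loop, and each step also proposes a random jump that is accepted by the Metropolis criterion, so the search can leave local minima. The one-hidden-layer network, learning rate, jump size, and cooling schedule are illustrative assumptions, not values from the thesis.

    import numpy as np

    rng = np.random.default_rng(0)
    N_HID = 8  # assumed hidden-layer width

    def loss_and_grad(w, X, y):
        """MSE loss and gradient for a one-hidden-layer tanh network
        whose weights are flattened into the vector w."""
        n_in = X.shape[1]
        W1 = w[: n_in * N_HID].reshape(n_in, N_HID)
        W2 = w[n_in * N_HID:].reshape(N_HID, 1)
        h = np.tanh(X @ W1)
        err = h @ W2 - y.reshape(-1, 1)
        loss = float(np.mean(err ** 2))
        dW2 = 2 * h.T @ err / len(y)
        dW1 = 2 * X.T @ ((err @ W2.T) * (1 - h ** 2)) / len(y)
        return loss, np.concatenate([dW1.ravel(), dW2.ravel()])

    def bp_sa(X, y, steps=2000, lr=0.05, jump=0.05, T0=1.0, cooling=0.995):
        w = rng.normal(scale=0.5, size=X.shape[1] * N_HID + N_HID)
        T = T0
        for _ in range(steps):
            # BP part: an ordinary gradient-descent step along the error surface.
            loss, grad = loss_and_grad(w, X, y)
            w = w - lr * grad
            # SA part: propose a random jump and accept it by the Metropolis
            # criterion exp(-dE/T), which lets the search escape local minima.
            loss, _ = loss_and_grad(w, X, y)
            cand = w + rng.normal(scale=jump, size=w.shape)
            cand_loss, _ = loss_and_grad(cand, X, y)
            if cand_loss < loss or rng.random() < np.exp(-(cand_loss - loss) / T):
                w = cand
            T *= cooling  # geometric cooling schedule
        return w

Given training data X of shape (N, n_in) and targets y of shape (N,), a call such as w = bp_sa(X, y) returns a flattened weight vector; as the temperature T decays, the acceptance of uphill jumps fades and the procedure reduces to pure BP.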
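The hierarchical coding and the HGA+BP loop can likewise be sketched. Here each chromosome carries binary control genes that switch hidden nodes on or off (the structure layer) and floating-point parameter genes for the weights, the crossover and mutation probabilities adapt to population fitness, and the best individual receives a few BP refinement steps each generation. All names, operators, and constants below are assumptions made for illustration; the thesis's exact coding scheme and adaptation formulas are not given in the abstract.

    import numpy as np

    rng = np.random.default_rng(1)
    MAX_HID = 12  # assumed upper bound on the hidden-layer size

    def new_chrom(n_in):
        # Hierarchical chromosome: binary control genes (structure) plus
        # floating-point parameter genes (weights) -- the hybrid coding idea.
        return {"ctrl": rng.integers(0, 2, MAX_HID),
                "W1": rng.normal(scale=0.5, size=(n_in, MAX_HID)),
                "W2": rng.normal(scale=0.5, size=MAX_HID)}

    def fitness(ch, X, y):
        mask = ch["ctrl"].astype(bool)
        pred = np.tanh(X @ ch["W1"][:, mask]) @ ch["W2"][mask]
        mse = float(np.mean((pred - y.ravel()) ** 2))
        return mse + 0.01 * mask.sum()  # small penalty on network complexity

    def adaptive_rates(f, f_avg, f_min, pc_hi=0.9, pm_hi=0.1):
        # Adaptive probabilities (a common Srinivas-Patnaik-style rule,
        # assumed here): individuals near the population best are disturbed
        # less, poor ones more, which fights premature convergence.
        if f <= f_avg:
            s = (f - f_min) / (f_avg - f_min + 1e-12)
            return pc_hi * s, pm_hi * s
        return pc_hi, pm_hi

    def bp_refine(ch, X, y, steps=20, lr=0.05):
        # BP fast local search: polish the parameter genes of one individual;
        # the control genes (structure) stay fixed.
        mask = ch["ctrl"].astype(bool)
        for _ in range(steps):
            h = np.tanh(X @ ch["W1"][:, mask])
            err = (h @ ch["W2"][mask] - y.ravel())[:, None]
            gW2 = 2 * h.T @ err.ravel() / len(y)
            gW1 = 2 * X.T @ (err * ch["W2"][mask] * (1 - h ** 2)) / len(y)
            ch["W1"][:, mask] -= lr * gW1
            ch["W2"][mask] -= lr * gW2

    def hga_bp(X, y, pop_size=30, gens=50):
        pop = [new_chrom(X.shape[1]) for _ in range(pop_size)]
        elite = pop[0]
        for _ in range(gens):
            fits = np.array([fitness(c, X, y) for c in pop])
            order = np.argsort(fits)
            elite = {k: v.copy() for k, v in pop[order[0]].items()}
            bp_refine(elite, X, y)  # fast local search near the current best
            children = [elite]      # elitism: the refined best survives
            while len(children) < pop_size:
                i, j = rng.choice(order[: pop_size // 2], size=2)
                pa, pb = pop[i], pop[j]
                pc, pm = adaptive_rates(min(fits[i], fits[j]),
                                        fits.mean(), fits.min())
                child = {k: v.copy() for k, v in pa.items()}
                if rng.random() < pc:
                    swap = rng.random(MAX_HID) < 0.5   # uniform crossover, control genes
                    child["ctrl"][swap] = pb["ctrl"][swap]
                    a = rng.random()                   # arithmetic crossover, weights
                    child["W1"] = a * pa["W1"] + (1 - a) * pb["W1"]
                    child["W2"] = a * pa["W2"] + (1 - a) * pb["W2"]
                flip = rng.random(MAX_HID) < pm        # bit-flip mutation, control genes
                child["ctrl"][flip] ^= 1
                child["W1"] += pm * rng.normal(size=child["W1"].shape)  # Gaussian mutation
                children.append(child)
            pop = children
        return elite

Penalizing the number of active hidden nodes inside the fitness is one simple way to trade approximation precision against model complexity, mirroring the compromise the abstract describes; the GA explores structures and weight regions globally, while bp_refine supplies the fast gradient search near promising extremum points.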
Keywords/Search Tags:artificial neural network, optimization, BP algorithm, HGA algorithm, SA algorithm, hybrid optimization strategy