For the structural optimization of neural networks, i.e., the challenging problem of determining the number of hidden layers and the number of neurons, we propose a structural optimization algorithm based on an improved genetic algorithm (IGA). The proposed algorithm is then employed to approximate the nonlinear function y = e^(-(x+1)^2) + e^(-(x-1)^2) in MATLAB. Extensive simulation demonstrates that the proposed optimization algorithm is efficient, improves the adaptability and generalization ability of neural networks, and exhibits rapid global convergence.

The main contributions of this thesis are as follows:

1. We thoroughly analyze the advantages and disadvantages of genetic algorithms with respect to the steps of parameter coding, initial population, fitness function design, and genetic operation design, respectively.

2. We propose an improved genetic algorithm that overcomes the deficiencies of the standard genetic algorithm (SGA) through parameter coding, a three-level hierarchical chromosome coding, fitness function selection, adaptive adjustment of the fitness-index proportion coefficients, genetic operations, an elitist strategy, and adaptive selection and crossover probabilities. The experimental results show that the IGA is an efficient approach to the structural optimization of neural networks.

3. We use the IGA to approximate the nonlinear function y = e^(-(x+1)^2) + e^(-(x-1)^2) in MATLAB. The simulation results verify the efficiency of the IGA.

Finally, we conclude with a discussion of our contributions in the thesis and sketch future work.
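Two of the ingredients listed in contribution 2, the elitist strategy and adaptive crossover probability, can be illustrated with a small toy example. The sketch below is not the thesis implementation: it searches over a single integer (a hidden-neuron count), uses a made-up fitness surrogate in place of the real network-training error, and assumes Srinivas-Patnaik-style adaptive crossover coefficients K1 and K3 whose values are chosen arbitrarily for illustration.

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical fitness: quality of a network with n hidden neurons.
# In the thesis the fitness reflects approximation error and network
# complexity; here a toy surrogate peaking at n = 12 stands in for it.
def fitness(n):
    return -(n - 12) ** 2

POP, GENS, N_MIN, N_MAX = 20, 40, 1, 30
K1, K3 = 1.0, 0.5   # adaptive-crossover coefficients (assumed values)
P_MUT = 0.1         # fixed mutation probability for this sketch

def adaptive_pc(f_parent, f_max, f_avg):
    # Adaptive crossover probability: fitter-than-average parents get a
    # smaller p_c, protecting good genetic material; weaker parents are
    # crossed over with a fixed higher probability K3.
    if f_max == f_avg:
        return K3
    if f_parent >= f_avg:
        return K1 * (f_max - f_parent) / (f_max - f_avg)
    return K3

pop = [random.randint(N_MIN, N_MAX) for _ in range(POP)]
history = []
for _ in range(GENS):
    fits = [fitness(n) for n in pop]
    f_max, f_avg = max(fits), sum(fits) / len(fits)
    elite = pop[fits.index(f_max)]   # elitist strategy: best survives intact
    history.append(f_max)
    new_pop = [elite]
    while len(new_pop) < POP:
        # simple tournament selection of one parent, random mate
        a, b = random.sample(range(POP), 2)
        p = pop[a] if fits[a] >= fits[b] else pop[b]
        q = pop[random.randrange(POP)]
        if random.random() < adaptive_pc(max(fitness(p), fitness(q)), f_max, f_avg):
            p = (p + q) // 2         # integer blend crossover
        if random.random() < P_MUT:
            p = min(N_MAX, max(N_MIN, p + random.choice([-2, -1, 1, 2])))
        new_pop.append(p)
    pop = new_pop

best = max(pop, key=fitness)
```

Because the elite individual is copied unchanged into every new generation, the best fitness recorded in `history` can never decrease, which is the convergence guarantee the elitist strategy provides.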