Artificial neural networks are functional imitations of simplified models of biological neurons and represent a fundamentally different approach to information processing in artificial intelligence. They consist of a large number of neurons connected to each other in a structure; different structures yield different types of networks. Neural networks are highly nonlinear and are commonly used to model complex relationships between inputs and outputs or to find patterns in data. Like biological neurons, artificial neural networks handle data in a massively parallel fashion, which allows fast processing of large data sets and gives the networks good fault tolerance and memory. This computational technique has proven effective for problems such as pattern recognition, robot control, and knowledge acquisition, so neural networks have broad applications. Fractional calculus is a generalization of integer-order calculus; because it possesses memory and hereditary properties, the combination of neural networks and fractional calculus is worth studying.

The main contents of this dissertation are as follows:

Firstly, this dissertation introduces the concept and properties of neural networks and some common neural network models, then reviews the state of research, at home and abroad, on combining fractional calculus with neural networks. A detailed introduction to the definitions, properties, and numerical methods of fractional calculus is also given: one chapter covers the definition, the basic algorithm, its advantages and disadvantages, and improvements to the algorithm, laying the groundwork for the study.

Secondly, after an analysis of the back-propagation (BP) neural network algorithm, fractional calculus is applied to this algorithm to construct a BP neural network based on fractional calculus. The Sigmoid function is used as the node function of the network, and real data serve as the sample set to train it.
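The abstract does not spell out the fractional-order update rule. A minimal, self-contained Python sketch of one plausible scheme is given below: a single sigmoid neuron trained by gradient descent in which each step is scaled by a Caputo-style power-law factor |w − w₀|^(1−α)/Γ(2−α). The specific factor, the reference points w₀ and b₀, and all parameter values are assumptions for illustration, not the dissertation's method.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def frac_factor(w, w0, alpha):
    # Caputo-style power-law factor |w - w0|^(1 - alpha) / Gamma(2 - alpha).
    # For alpha = 1 this equals 1, recovering ordinary gradient descent.
    return abs(w - w0) ** (1.0 - alpha) / math.gamma(2.0 - alpha)

def train(samples, alpha=0.9, lr=0.5, epochs=2000):
    """Train a single sigmoid neuron y = sigmoid(w*x + b) with a
    fractional-order-flavoured gradient step (hypothetical sketch)."""
    w, b = random.uniform(-1, 1), random.uniform(-1, 1)
    w0, b0 = w - 1e-3, b - 1e-3   # reference points for the power-law term
    for _ in range(epochs):
        for x, t in samples:
            y = sigmoid(w * x + b)
            delta = (y - t) * y * (1.0 - y)   # dE/dnet for squared error
            w -= lr * delta * x * frac_factor(w, w0, alpha)
            b -= lr * delta * frac_factor(b, b0, alpha)
    return w, b

if __name__ == "__main__":
    random.seed(0)
    data = [(-2.0, 0.0), (-1.0, 0.0), (1.0, 1.0), (2.0, 1.0)]
    w, b = train(data)
    print(all(round(sigmoid(w * x + b)) == t for x, t in data))
```

Varying `alpha`, `lr`, and the network size in such a sketch is the kind of parameter study the dissertation reports for the full BP network.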
By varying the fractional order, the learning rate, and the number of hidden-layer nodes, the effects of these three parameters on network training are summarized. Compared with the training of the integer-order network, the fractional-order network is found to converge faster, but its convergence error is larger. A variable-order iterative learning law for the BP neural network is therefore proposed: the fractional order is adjusted adaptively and gradually turned into an integer order. The result is that the network achieves both faster convergence and a smaller convergence error.

Thirdly, the Mittag-Leffler function is introduced and the relationships between its values and the parameters α and β are analyzed. A BP neural network based on the Mittag-Leffler function is then compared, and the impact of varying α and β on network training is analyzed.

Finally, the fractional-order neural network is applied to the steering control of a trailer, with the network's output serving as the input to the compartment's wheel steering. According to the lag error between the steering tractor and the compartment, the network weights are adjusted adaptively so that the tractor and compartment ultimately achieve lag-synchronous driving.
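The two-parameter Mittag-Leffler function mentioned in the third part is defined by the power series E_{α,β}(z) = Σ_{k≥0} z^k / Γ(αk + β). For small arguments it can be evaluated by simple truncation; the sketch below does this (the truncation limits and tolerance are assumptions, not from the dissertation). Note that E_{1,1}(z) reduces to e^z and E_{2,1}(z) to cosh(√z), which serve as sanity checks.

```python
import math

def mittag_leffler(z, alpha=1.0, beta=1.0, terms=50, tol=1e-15):
    """Two-parameter Mittag-Leffler function E_{alpha,beta}(z),
    evaluated by truncating its power series (adequate for small |z|)."""
    total = 0.0
    for k in range(terms):
        g = alpha * k + beta
        if g > 170:          # math.gamma overflows beyond ~171
            break
        term = z ** k / math.gamma(g)
        total += term
        if abs(term) < tol:  # remaining terms are negligible
            break
    return total

if __name__ == "__main__":
    # Sanity checks against the classical special cases:
    print(abs(mittag_leffler(1.0, 1.0, 1.0) - math.e) < 1e-9)
    print(abs(mittag_leffler(1.0, 2.0, 1.0) - math.cosh(1.0)) < 1e-9)
```

Sweeping `alpha` and `beta` over a grid with such a routine reproduces, in miniature, the parameter analysis described above.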