The long short-term memory (LSTM) network is a deep learning model for time-series data. In practical applications, however, two deficiencies of the LSTM network have been observed: first, because the network incorporates only the state value of the immediately preceding time step, it does not make full use of the state values of earlier time steps; second, Jozefowicz et al. pointed out in 2015 that if the bias of the forget gate is initialized to a small value, poor training results arise. In view of these shortcomings, this thesis improves the network structure of the LSTM and proposes a model with faster computation, faster convergence, and more accurate prediction.

First, this thesis summarizes the fundamentals of deep neural networks, including the network structure, commonly used loss functions, the back-propagation training algorithm, commonly used activation functions, and several methods for preventing overfitting.

Second, the LSTM structure is improved and an adjust-gate long short-term memory network is proposed. The LSTM is optimized in two ways: (1) a gating unit called the adjust gate is proposed, which replaces the input gate unit and forget gate unit of the original network; (2) when computing the input squashing unit, the input gate unit, and the adjust gate unit, hidden-layer information from the two preceding time steps is added. In addition, a back-propagation-through-time algorithm is derived for training the adjust-gate network.

Finally, the closing price of the Shanghai Stock Exchange Index is predicted using both the long short-term memory 
network and the adjust-gate long short-term memory network proposed in this thesis. The results show that the adjust-gate network not only improves training speed by 12.80% but also converges faster and is more stable than the standard LSTM network. When the trained model is applied to the test sample set, the error between the predicted and true values is reduced by 9.57%. The numerical experiments show that the proposed adjust-gate long short-term memory network has two advantages: first, its memory unit stores state information from two time steps, which strengthens the network's ability to model dynamic time-series data and improves its prediction accuracy on such data; second, the adjust gate unit not only enhances the stability of training but also improves the computing speed and convergence speed of the network.
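The abstract does not give the cell equations, so the following is only a minimal sketch of how an adjust-gate cell might look, under two explicit assumptions: that the single adjust gate couples the roles of the input and forget gates as a convex blend of the old cell state and the new candidate, and that the gate and candidate units read the hidden states of the two preceding time steps. All function and parameter names here are hypothetical illustrations, not the thesis's actual formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ag_lstm_step(x, h1, h2, c_prev, params):
    """One step of a hypothetical adjust-gate LSTM cell.

    Assumptions (the abstract gives no equations):
    - a single "adjust gate" a replaces the input and forget gates,
      blending old and new state as c_t = a * c_{t-1} + (1 - a) * g_t;
    - the gate and candidate units see the hidden states of the two
      preceding steps, h1 = h_{t-1} and h2 = h_{t-2}.
    """
    z = np.concatenate([x, h1, h2])               # input plus two past hidden states
    a = sigmoid(params["Wa"] @ z + params["ba"])  # adjust gate
    g = np.tanh(params["Wg"] @ z + params["bg"])  # candidate (input squashing) unit
    o = sigmoid(params["Wo"] @ z + params["bo"])  # output gate, kept as in a standard LSTM
    c = a * c_prev + (1.0 - a) * g                # one gate controls keep vs. write
    h = o * np.tanh(c)
    return h, c

# Tiny usage example with random parameters (illustration only).
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
params = {k: rng.standard_normal((n_h, n_in + 2 * n_h)) * 0.1
          for k in ("Wa", "Wg", "Wo")}
params.update({b: np.zeros(n_h) for b in ("ba", "bg", "bo")})
h, c = ag_lstm_step(rng.standard_normal(n_in),
                    np.zeros(n_h), np.zeros(n_h), np.zeros(n_h), params)
```

Under this reading, the speed advantage is plausible because one gate computation (the adjust gate) does the work of two (input and forget gates), while the two-step hidden input is what lets the memory unit reflect state information from two moments.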