Traffic congestion hinders the development of cities, and real-time, efficient intelligent transportation systems can effectively relieve traffic pressure. As an important component of intelligent transportation systems, traffic flow prediction provides real-time, dynamic guidance for formulating traffic management measures. Existing traffic flow prediction methods suffer from problems such as insufficient feature extraction capability and low prediction accuracy. Knowledge distillation can transfer knowledge between networks and improve model performance. This paper therefore introduces knowledge distillation into the field of traffic flow prediction and proposes graph models for traffic flow prediction based on knowledge distillation. The main research work is as follows:

(1) A graph model for traffic flow prediction based on mutual learning is proposed. The diffusion convolutional recurrent neural network (DCRNN) is selected as the benchmark model and optimized with the mutual learning algorithm of knowledge distillation, yielding the traffic flow prediction graph model DMCRNN. Unlike the original model, the two diffusion convolutional recurrent networks in DMCRNN learn from and guide each other, which improves the prediction performance of both networks. Experimental results show that, compared with the benchmark model, DMCRNN reduces the error on all three evaluation metrics over both datasets, especially for long-term prediction, indicating the effectiveness of the proposed method and the better robustness of the DMCRNN model.

(2) A graph model for traffic flow prediction based on self-distillation is proposed. The diffusion convolutional recurrent neural network is again used as the benchmark model and optimized by the self-distillation method of knowledge distillation. The DLBCRNN model is proposed by using the historical
information of the previous mini-batch during training for self-distillation (Self-Distillation from Last Mini-Batch, DLB). Experimental comparison with the benchmark model shows that DLBCRNN supervises the network with its own historical information, which reduces the prediction error relative to the benchmark model.

(3) A traffic flow prediction graph model, SMKDCRNN, based on self-mutual knowledge distillation is proposed. It combines self-distillation and mutual learning: on one hand, self-distillation deepens the connection between the shallow and deep structures of the network; on the other hand, the mutual learning algorithm enables the two networks to guide each other during training, compensating for insufficient local and global feature perception and extraction capabilities. Compared with either single knowledge distillation optimization algorithm alone, the model combining the two achieves higher prediction accuracy. Moreover, compared with recent prediction models, SMKDCRNN delivers better prediction performance.

In summary, this paper proposes three graph models for traffic flow prediction by optimizing the diffusion convolutional recurrent neural network with knowledge distillation. All three methods improve prediction accuracy over the benchmark model, demonstrating the effectiveness of applying knowledge distillation to the field of traffic flow prediction.
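The mutual-learning and DLB self-distillation objectives described above can be sketched as follows. This is a minimal illustration only: the abstract does not specify the loss formulations used in DMCRNN and DLBCRNN, so the MSE task term, the MSE mimicry/consistency terms, and the weight `lam` are assumptions chosen for a regression setting such as traffic flow prediction.

```python
import numpy as np

def mutual_learning_losses(pred_a, pred_b, target, lam=0.5):
    """Illustrative mutual-learning objective for two peer networks.

    Each network's loss = its own task error (MSE to ground truth)
    plus a mimicry term pulling its prediction toward its peer's
    (the peer's output is treated as a fixed soft target).
    """
    task_a = np.mean((pred_a - target) ** 2)
    task_b = np.mean((pred_b - target) ** 2)
    mimic = np.mean((pred_a - pred_b) ** 2)  # symmetric peer-mimicry term
    return task_a + lam * mimic, task_b + lam * mimic

def dlb_loss(pred_now, pred_prev, target, lam=0.5):
    """Illustrative self-distillation-from-last-mini-batch (DLB) objective.

    The model's own prediction on the previous mini-batch's samples
    (pred_prev) serves as a soft target that regularizes the current
    prediction (pred_now) on those same samples.
    """
    task = np.mean((pred_now - target) ** 2)
    consistency = np.mean((pred_now - pred_prev) ** 2)
    return task + lam * consistency
```

In this sketch the self-mutual combination of method (3) would simply sum both regularizers into one objective; the actual weighting and scheduling in SMKDCRNN are not given in the abstract.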