
Improved Transformer and Its Application in Financial Time Series Forecasting

Posted on: 2024-02-24    Degree: Master    Type: Thesis
Country: China    Candidate: X W Xie    Full Text: PDF
GTID: 2530307124492824    Subject: Statistics
Abstract/Summary:
Time series forecasting is currently an active research topic, and much of the recent work approaches it with deep learning models. However, existing forecasting methods have notable limitations: they do not adequately account for the complexity of the data, and they lack a model that is both fast to compute and highly accurate. For the time series forecasting task, we improve the Transformer model by adopting sparse attention, introducing a linear decoder structure, and optimizing memory usage. Experiments confirm that on large and complex datasets the model reduces computation while maintaining prediction accuracy. The main contributions of this thesis are as follows:

1. The SP_TS model introduces a sparse attention mechanism into the Transformer, which reduces the computational complexity of the prediction model to O(N log N), and replaces the standard decoder with a linear decoder, which simplifies the decoder structure, speeds up prediction, and reduces the number of model parameters. Experiments on the public datasets ETTh1, ETTm1, energy_data and weather_features confirm that SP_TS maintains good prediction accuracy and fast computation on time series forecasting tasks. In the multivariate prediction task, the mean MSE of SP_TS is the lowest among all compared models. In addition, measured by MSE and MAE, SP_TS achieves the highest total count of best results (55) across the different datasets among all compared models.

2. We apply the SP_TS network to financial time series forecasting. In experiments on the CSI 300 dataset, we use MSE, MAE, training time, test time and prediction accuracy as evaluation metrics and compare against ARIMA, SVR, LSTM, Transformer and Informer models to verify the feasibility and effectiveness of SP_TS. The results show that SP_TS has the shortest training time and achieves better prediction accuracy over long output windows. Experiments on Shanghai Composite Index and gold futures data show that SP_TS has advantages in both model error and computation time.
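
The two ideas above, a sparse attention whose cost grows as N log N and a linear decoder that maps the encoded window directly to the forecast horizon, can be illustrated with a short PyTorch sketch. This is not the author's SP_TS code: the class names SparseAttention and SparseForecaster, the factor hyperparameter, and the window/horizon lengths are illustrative assumptions, and for clarity the sketch still forms the full score matrix and merely restricts softmax attention to the top-u most "active" queries (an Informer-style ProbSparse implementation would also sample keys to actually reach O(N log N) cost).

import math
import torch
import torch.nn as nn


class SparseAttention(nn.Module):
    """Top-u query attention: only u = factor * ceil(ln L) queries attend fully."""
    def __init__(self, d_model, factor=5):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.factor = factor

    def forward(self, x):                                   # x: (B, L, d_model)
        B, L, D = x.shape
        Q, K, V = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        u = max(1, min(L, self.factor * math.ceil(math.log(L + 1))))
        # Score matrix is formed fully here for clarity; a ProbSparse version samples keys.
        scores = Q @ K.transpose(-2, -1) / math.sqrt(D)     # (B, L, L)
        # "Active" queries: largest gap between max score and mean score.
        sparsity = scores.max(dim=-1).values - scores.mean(dim=-1)    # (B, L)
        top_idx = sparsity.topk(u, dim=-1).indices                    # (B, u)
        # Lazy queries fall back to the mean of V; active queries get full attention.
        out = V.mean(dim=1, keepdim=True).expand(B, L, D).clone()
        active_scores = torch.gather(scores, 1, top_idx.unsqueeze(-1).expand(B, u, L))
        active_out = torch.softmax(active_scores, dim=-1) @ V         # (B, u, D)
        out.scatter_(1, top_idx.unsqueeze(-1).expand(B, u, D), active_out)
        return out


class SparseForecaster(nn.Module):
    """Sparse-attention encoder followed by a single linear decoder."""
    def __init__(self, n_features, d_model=64, seq_len=96, pred_len=24):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.attn = SparseAttention(d_model)
        self.norm = nn.LayerNorm(d_model)
        # Linear decoder: one projection from the encoded window to the whole horizon.
        self.decoder = nn.Linear(seq_len * d_model, pred_len * n_features)
        self.pred_len, self.n_features = pred_len, n_features

    def forward(self, x):                                   # x: (B, seq_len, n_features)
        h = self.embed(x)
        h = self.norm(h + self.attn(h))
        y = self.decoder(h.flatten(1))
        return y.view(-1, self.pred_len, self.n_features)


# Example: predict 24 future steps of a 7-variable series from a 96-step window.
model = SparseForecaster(n_features=7, seq_len=96, pred_len=24)
y_hat = model(torch.randn(8, 96, 7))                        # shape: (8, 24, 7)

The single linear decoder replaces the Transformer's autoregressive decoder stack and emits the whole output window in one step, which is consistent with the shorter training and inference times reported above for long output windows.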
Keywords/Search Tags: Transformer model, Self-Attention mechanism, Sparse attention, Financial forecasting, Time series