Time series forecasting plays an important role in fields such as economics, transportation, energy, and statistics, and point forecasting is a central problem within it. This thesis focuses on the multi-step point forecasting problem. Deep learning methods have been widely applied to time series forecasting; this thesis analyzes and summarizes the existing deep learning literature and finds that existing deep learning models face two major challenges: the data distribution discrepancy problem and the long sequence time series forecasting (LSTF) problem. Because deep learning forecasting models can be trained in two different ways, this thesis is divided into two parts: training with a hybrid dataset and training with a single-source dataset. It investigates how the data distribution discrepancy problem and the LSTF problem affect forecasting in each of these two scenarios.

In the hybrid-dataset scenario, this thesis analyzes the data distribution discrepancy problem. Because a hybrid dataset combines different time series, the distribution of sample values varies significantly across the dataset. To address this problem, this thesis proposes the MIR-TS model. MIR-TS uses residual encoding to aggregate the input features around zero, so that the decoder's input forms a dense distribution, which improves the decoder's forecasting ability. In addition, MIR-TS applies multi-level decoding aggregation, decoding different residual features with independent decoders, which enhances the model's feature-reuse capability. The effectiveness of MIR-TS is validated on two hybrid time series datasets.

In the single-source scenario, this thesis analyzes both the data distribution discrepancy problem and the LSTF problem. It first proposes a normalization method along the time dimension to reduce the impact of the distribution discrepancy. It then analyzes two statistical models on the LSTF problem; the results indicate that purely deep learning methods struggle to achieve the best LSTF performance. Therefore, this thesis proposes a hybrid time series forecasting model that combines a Transformer encoder with a linear decoder, fusing linear regression and deep learning. The hybrid model is evaluated on six real-world datasets, and the results show that it achieves state-of-the-art performance on all six single-source datasets as well as on the LSTF problem.
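To make the MIR-TS ideas concrete, the sketch below illustrates residual encoding (aggregating inputs around zero via first-order differencing) together with multi-level decoding aggregation by independent decoders. The abstract does not specify the actual architecture, so all layer types, sizes, and names here are illustrative assumptions, not the thesis implementation:

```python
import torch
import torch.nn as nn

class ResidualMultiDecoder(nn.Module):
    """Minimal sketch (not the actual MIR-TS code): encode first-order
    residuals so encoder inputs cluster around zero, then decode several
    residual feature levels with independent decoders and aggregate."""

    def __init__(self, input_len: int, pred_len: int, num_levels: int = 3):
        super().__init__()
        # One encoder and one independent decoder per residual level
        # (the "multi-level decoding aggregation" idea).
        self.encoders = nn.ModuleList(
            [nn.Linear(input_len - 1, input_len - 1) for _ in range(num_levels)]
        )
        self.decoders = nn.ModuleList(
            [nn.Linear(input_len - 1, pred_len) for _ in range(num_levels)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_len). First differences are centered near zero
        # for most real series, densifying the decoder's input distribution.
        residual = x[:, 1:] - x[:, :-1]
        last = x[:, -1:]                      # anchor for de-differencing
        feat, out = residual, 0.0
        for enc, dec in zip(self.encoders, self.decoders):
            feat = enc(feat)                  # next residual feature level
            out = out + dec(feat)             # aggregate decoder outputs
        # Cumulative sum converts predicted differences back to levels.
        return last + torch.cumsum(out, dim=-1)
```

For example, `ResidualMultiDecoder(input_len=96, pred_len=24)` maps a 96-step window to a 24-step forecast, with the final de-differencing step restoring the original value scale.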
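The normalization along the time dimension can be sketched as per-window instance normalization: each input window is rescaled to zero mean and unit variance over its time steps, and the statistics are kept so the forecast can be mapped back to the original scale. The thesis's exact formulation is not given in this abstract, so this is only one common realization of the idea:

```python
import torch

def normalize_time_dim(x: torch.Tensor, eps: float = 1e-5):
    """Normalize each window along the time dimension (zero mean, unit
    variance per sample) and return the stats for later inversion.
    x: (batch, time, features)."""
    mean = x.mean(dim=1, keepdim=True)
    std = x.std(dim=1, keepdim=True) + eps
    return (x - mean) / std, mean, std

def denormalize(y: torch.Tensor, mean: torch.Tensor, std: torch.Tensor):
    """Map model outputs back to the original scale of each window."""
    return y * std + mean
```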
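Finally, the encoder/decoder split of the proposed hybrid model can be sketched as a Transformer encoder feeding a single linear layer that regresses the forecast horizon directly. Only the Transformer-encoder-plus-linear-decoder structure comes from the abstract; the embedding scheme and hyperparameters below are assumptions for illustration:

```python
import torch
import torch.nn as nn

class TransformerLinearHybrid(nn.Module):
    """Sketch of the hybrid described in the abstract: a Transformer
    encoder extracts features, and a linear decoder (plain linear
    regression over the flattened features) produces the forecast."""

    def __init__(self, input_len: int, pred_len: int,
                 d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)    # per-step value embedding
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.decoder = nn.Linear(input_len * d_model, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_len) univariate window
        h = self.encoder(self.embed(x.unsqueeze(-1)))   # (B, L, d_model)
        return self.decoder(h.flatten(start_dim=1))     # (B, pred_len)
```

For instance, `TransformerLinearHybrid(input_len=96, pred_len=24)(torch.randn(8, 96))` yields a tensor of shape `(8, 24)`; keeping the decoder linear is what lets the model fuse linear regression with a deep feature extractor.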