In recent years, with the continuous development of Internet of Things technology and 5G networks, time series data have become easier and faster to obtain. In fields such as finance, transportation, and electricity, research on time series data has long been a focus for scholars. Compared with traditional machine learning methods, which require hand-designed rules to extract data features, deep learning methods can train models and complete target tasks in a data-driven manner. This paper focuses on single-step and multi-step prediction based on deep learning methods. The main contents of this paper are as follows:

(1) For single-step prediction, this paper proposes BiLSTMA, a network model based on the encoder-decoder structure that combines a bidirectional long short-term memory (LSTM) network with an attention mechanism. The model mainly addresses two shortcomings of traditional methods: extracting only forward sequence features and being unable to focus selectively on hidden states. The encoder uses a bidirectional LSTM network to extract the forward and backward features of the input sequence. The attention layer then computes a weighted summation of the bidirectional hidden states extracted by the encoder, and the resulting hidden state helps the decoder make predictions. Experiments on different datasets demonstrate the rationality and effectiveness of the model.

(2) For multi-step prediction, this paper proposes SWLHT, a model based on a memory mechanism and the Transformer structure that supports a short-window, long-horizon prediction manner. To address the error accumulation in previous work and the prediction fragmentation caused by the Transformer, the model introduces a memory module and a multi-segment prediction manner. The memory module lets the model access information from the previous step when processing the current input, which increases the flow of information and alleviates fragmentation. Meanwhile, exploiting the sequence-to-sequence characteristics of the Transformer, the multi-segment prediction manner alleviates the accumulation of errors. Finally, experiments demonstrate the superiority of the proposed model in long-term prediction.
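
To make the encoder-attention idea in contribution (1) concrete, the following is a minimal PyTorch sketch rather than the thesis implementation: a bidirectional LSTM encodes the input window, an attention layer forms a weighted sum of the bidirectional hidden states, and a simple linear head stands in for the decoder to produce the single-step forecast. All layer names, sizes, and the linear decoder head are illustrative assumptions.

import torch
import torch.nn as nn

class BiLSTMAttentionForecaster(nn.Module):
    def __init__(self, input_size=1, hidden_size=64):
        super().__init__()
        # Bidirectional LSTM extracts forward and backward features of the window.
        self.encoder = nn.LSTM(input_size, hidden_size,
                               batch_first=True, bidirectional=True)
        # Scoring layer for attention over the 2*hidden_size bidirectional states.
        self.score = nn.Linear(2 * hidden_size, 1)
        # Simplified one-step decoder head mapping the attended context to a value.
        self.decoder = nn.Linear(2 * hidden_size, 1)

    def forward(self, x):
        # x: (batch, window_len, input_size)
        states, _ = self.encoder(x)                          # (batch, window_len, 2*hidden)
        weights = torch.softmax(self.score(states), dim=1)   # attention over time steps
        context = (weights * states).sum(dim=1)              # weighted sum of hidden states
        return self.decoder(context)                         # (batch, 1): next-step forecast

# Usage: predict the next value from a window of 24 past observations.
model = BiLSTMAttentionForecaster()
window = torch.randn(8, 24, 1)
print(model(window).shape)  # torch.Size([8, 1])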
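
The multi-segment prediction idea in contribution (2) can be sketched in a similar hedged way. In this illustrative setup, a Transformer encoder predicts one segment of the horizon at a time from a short input window, and a pooled memory token summarising the previously processed segment is prepended to the next call, so later segments see earlier information instead of feeding individual predictions back step by step. The segment length, model sizes, mean-pooled memory token, and the reuse of the same input window for every segment are assumptions for illustration, not details taken from the thesis.

import torch
import torch.nn as nn

class SegmentPredictor(nn.Module):
    def __init__(self, d_model=32, nhead=4, segment_len=12):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, segment_len)   # one horizon segment per call

    def forward(self, window, memory=None):
        # window: (batch, window_len, 1); memory: (batch, 1, d_model) or None
        tokens = self.embed(window)
        if memory is not None:
            tokens = torch.cat([memory, tokens], dim=1)  # prepend memory token
        encoded = self.encoder(tokens)
        summary = encoded.mean(dim=1)                    # pooled representation
        segment = self.head(summary)                     # (batch, segment_len)
        # New memory summarises what was just processed, for the next segment.
        return segment, summary.unsqueeze(1)

# Usage: predict a 36-step horizon as three 12-step segments from a 24-step window.
model = SegmentPredictor()
window = torch.randn(8, 24, 1)
memory, outputs = None, []
for _ in range(3):
    segment, memory = model(window, memory)
    outputs.append(segment)
forecast = torch.cat(outputs, dim=1)
print(forecast.shape)  # torch.Size([8, 36])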