
Research On Time Series Prediction Method Based On Deep Neural Network

Posted on: 2024-07-29
Degree: Master
Type: Thesis
Country: China
Candidate: L W Liu
Full Text: PDF
GTID: 2530307121983489
Subject: Electronic information
Abstract/Summary:
A time series is a set of random variables arranged in time: data points are ordered according to the sequence in which they occur and together form a sequence. Because sampling intervals differ, time series data are collected in different ways, but all describe the state of a phenomenon, or the degree to which it changes, over time. Time series data are ubiquitous in daily life and are widely used in fields such as mathematics and finance, signal processing, weather forecasting, and earthquake prediction. These data are of high value in their respective research areas, and analyzing and exploiting them accurately and effectively can help reduce labor costs, improve productivity, and increase economic returns. Because time series data contain a variety of important information generated by human production activities, modeling time series has long been an important direction in academic research. Traditional time series methods have relied on known parametric models from various specialized fields, but with the development of modern machine learning and the growth of data availability and computational power, purely data-driven approaches are becoming the mainstream means of studying time series. Machine learning has become an important component of time series forecasting, with deep learning in particular dominating in recent years. Deep neural networks are a more sophisticated class of machine learning algorithms that can automatically extract useful features from large amounts of data, so using deep learning to study time series data is now the main research direction. Multivariate time series are more complex and carry more information, and multivariate time series prediction is one of the more difficult and meaningful problems in time series prediction: effective analysis and accurate prediction of multivariate time series help analyze the data more comprehensively and yield more accurate results.

The Transformer is the most prominent Sequence-to-Sequence (Seq2Seq) model architecture of recent years, and current Transformer-based time series prediction algorithms mainly use its attention mechanism to process a sequence of data. Unlike Recurrent Neural Network (RNN) based approaches, the Transformer allows the model to access any part of the historical data regardless of distance, which makes it potentially better suited to learning repetitive patterns with long-term dependencies. However, it still tends to overfit when the amount of training data is small, which degrades model performance; moreover, the Transformer architecture used in general time series prediction tasks can capture only the directional information of vectors in multivariate time series prediction, a special kind of temporal relationship, and therefore discards some informative components.

To address the overfitting of Transformer models when data are insufficient, this thesis first proposes SSMT, a time series prediction algorithm that combines State Space Models (SSM) with the Transformer. State space models, as specialized statistical models, carry stronger prior information during learning, so when that prior is correct the amount of data needed for training can be reduced. Following the structure of the state space model, SSMT uses the Transformer to map data features into the parameters of the state space model, then uses the state space model to predict the probability distribution of the time series at each time point, and finally produces concrete predictions from that distribution. Tests on real data sets show that the algorithm alleviates the Transformer's tendency to overfit and achieves
better results on small data sets. Second, to address the problem that the Transformer cannot adequately capture useful information in multivariate time series, which reduces prediction accuracy, this thesis proposes generating a feature space that exhibits weak correlation between tasks and strong correlation within tasks, so that the model can distinguish dynamics from the temporal and spatial perspectives respectively and thereby improve the accuracy of multivariate time series prediction. To this end, the thesis proposes the STN-TFT algorithm, which combines the TFT framework with an STN module to solve the problems above.
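The SSMT pipeline described above (a learned encoder emitting state space parameters, then probabilistic prediction through the state space model) can be sketched in miniature. The thesis does not publish code, so the snippet below is only an illustration under simplifying assumptions: a local-level linear-Gaussian state space model stands in for whatever model the thesis uses, and a hypothetical linear map (`encode_params`) stands in for the Transformer encoder. The function names are illustrative, not from the thesis.

```python
import numpy as np

def encode_params(features, W, b):
    """Hypothetical stand-in for the SSMT Transformer encoder: map per-series
    features to the parameters of a local-level state space model.
    Softplus keeps the noise scales positive."""
    raw = features @ W + b
    softplus = lambda x: np.log1p(np.exp(x))
    sigma, q = softplus(raw[0]), softplus(raw[1])  # observation / state noise
    l0 = raw[2]                                    # initial level
    return l0, sigma, q

def kalman_filter_local_level(y, l0, sigma, q):
    """Filter the local-level model  l_t = l_{t-1} + xi_t,  y_t = l_t + eps_t,
    returning the posterior mean and variance of the final level."""
    l, P = l0, 1.0                 # state mean and variance
    for obs in y:
        P = P + q ** 2             # predict step: state noise accumulates
        S = P + sigma ** 2         # innovation variance
        K = P / S                  # Kalman gain
        l = l + K * (obs - l)      # update step
        P = (1 - K) * P
    return l, P

def forecast(l, P, sigma, q, horizon):
    """Predictive distribution h steps ahead: for a local-level model the mean
    stays at the filtered level while the variance grows with the horizon."""
    means = np.full(horizon, l)
    variances = np.array([P + h * q ** 2 + sigma ** 2
                          for h in range(1, horizon + 1)])
    return means, variances
```

The point of the sketch is the division of labor SSMT's description implies: the encoder only produces parameters, while the state space model's own recursions turn them into a per-step probability distribution, from which point forecasts (e.g., the mean) are read off.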
Keywords/Search Tags: time series prediction, recurrent neural network, deep learning, self-attention mechanism, Transformer, state space model, time normalization, space normalization