
Research On LSTM Method For Deep Learning Of Internet Of Things Time Series Data

Posted on: 2019-04-05
Degree: Master
Type: Thesis
Country: China
Candidate: X F Xie
Full Text: PDF
GTID: 2428330545972911
Subject: Computer technology
Abstract/Summary:
With the continuous improvement of computing performance, deep learning has developed considerably and has made great progress in computer vision, machine translation, and speech recognition. Internet of Things (IoT) and smart-city deployments generate large amounts of time-series sensor data, and applying deep learning to these domains has become an important research topic: effective analysis of IoT time series data yields valuable information that provides a sound basis for decision-making in IoT applications.

This paper first studies the basic architecture of the IoT and the characteristics of the time series data generated within it, and designs detailed data pre-processing for those characteristics, laying the foundation for the subsequent model construction.

Traditional statistical methods, classical machine learning methods, feed-forward neural networks, and recurrent neural networks all have limitations for this task: low prediction accuracy, an inability to capture long-range temporal dependencies, suitability only for short-term prediction, and susceptibility to vanishing or exploding gradients. Through an in-depth study of deep learning methods, including feed-forward networks, recurrent networks, and LSTM networks, this paper proposes an IoT time series forecasting model based on an LSTM network with multi-feature fusion. The model augments the LSTM with a multi-feature fusion layer, compensating for the LSTM's inability to extract multiple features on its own: it both refines the temporal relationships within the IoT data and extracts the influence of external factors on future values. The experimental results show that the proposed LSTM and multi-feature fusion model predicts long-term, short-term, and even weakly time-dependent IoT time series data well.

Furthermore, to handle the data anomalies that may exist in IoT data, this paper trains the proposed prediction model on normal data sets and uses it to generate prediction residuals, and then builds a Gaussian Naive Bayes model on those residuals to detect anomalies in IoT time series data. This is not a simple model fusion: the residuals produced by the improved LSTM prediction model are themselves time series, so they preserve the strong temporal connection between the data before and after an anomalous event, while the Gaussian Naive Bayes model contributes its strong classification ability. The experimental results also show that the proposed anomaly detection model outperforms the other models in the comparison experiments.
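To make the proposed architecture concrete, the following is a minimal sketch of an LSTM forecaster with a multi-feature fusion layer. It is not the thesis's actual implementation: the framework (Keras/TensorFlow), the layer sizes, the window length WINDOW, and the number of external features N_EXOG are all illustrative assumptions.

```python
# A minimal sketch (assumed Keras/TensorFlow; the thesis does not name its
# framework) of an LSTM forecaster with a multi-feature fusion layer.
from tensorflow.keras import Model, layers

WINDOW = 24   # assumed length of the historical sensor window
N_EXOG = 4    # assumed number of external features (weather, weekday, ...)

# Branch 1: the LSTM refines the temporal relationships in the sensor series.
seq_in = layers.Input(shape=(WINDOW, 1), name="sensor_window")
h_seq = layers.LSTM(64)(seq_in)

# Branch 2: a dense encoding of the external, non-temporal factors.
exog_in = layers.Input(shape=(N_EXOG,), name="external_features")
h_exog = layers.Dense(16, activation="relu")(exog_in)

# Multi-feature fusion layer: concatenating both representations lets the
# model combine the temporal pattern with the external influences.
fused = layers.concatenate([h_seq, h_exog])
hidden = layers.Dense(32, activation="relu")(fused)
out = layers.Dense(1, name="next_value")(hidden)

model = Model(inputs=[seq_in, exog_in], outputs=out)
model.compile(optimizer="adam", loss="mse")
```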
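Likewise, here is a minimal sketch of the residual-based anomaly detection stage, assuming the `model`, `WINDOW`, and `N_EXOG` names from the previous sketch and scikit-learn's GaussianNB; the data and labels below are synthetic stand-ins, not the thesis's data sets.

```python
# A minimal sketch of the residual-based anomaly detection stage. In the
# thesis, the forecaster is first trained on normal data so that anomalous
# readings produce large residuals; the arrays below are synthetic stand-ins.
import numpy as np
from sklearn.naive_bayes import GaussianNB

def residuals(model, X_seq, X_exog, y_true):
    """Residual = observed value minus the forecaster's prediction."""
    y_pred = model.predict([X_seq, X_exog]).ravel()
    return (y_true - y_pred).reshape(-1, 1)

# Synthetic labelled data (0 = normal, 1 = anomalous) standing in for real
# IoT readings with known anomalies.
rng = np.random.default_rng(0)
X_seq = rng.normal(size=(200, WINDOW, 1))
X_exog = rng.normal(size=(200, N_EXOG))
y_true = rng.normal(size=200)
labels = rng.integers(0, 2, size=200)

# Residuals of a forecaster trained on normal data cluster near zero, so the
# class-conditional Gaussians of the two labels separate, which is exactly
# what Gaussian Naive Bayes exploits for classification.
clf = GaussianNB()
clf.fit(residuals(model, X_seq, X_exog, y_true), labels)
is_anomaly = clf.predict(residuals(model, X_seq, X_exog, y_true))
```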
Keywords/Search Tags: Deep Learning, IoT, Time Series Data, LSTM, Anomaly Detection, Gaussian Naive Bayes