
Tensor Completion Algorithms Based On Deep Learning

Posted on: 2021-12-14    Degree: Master    Type: Thesis
Country: China    Candidate: H L Lu    Full Text: PDF
GTID: 2518306122974639    Subject: Computer Science and Technology
Abstract/Summary:
Monitoring the performance of a large network is crucial for network status tracking, performance optimization, traffic engineering, anomaly detection, fault analysis, and more. Obtaining a full network performance view, however, incurs a high measurement cost. To reduce this cost, one can measure only a subset of paths or time intervals and infer the remaining network data by leveraging its spatio-temporal correlations, so the quality of missing-data recovery depends heavily on the inference algorithm. Tensor completion has attracted recent attention for its ability to exploit the multi-dimensional data structure for more accurate missing-data inference. However, current tensor completion algorithms model the third-order interaction of data features only through the inner product, which is insufficient to capture the high-order, nonlinear correlations across different feature dimensions. In recent years, deep learning has achieved immense success in speech recognition, computer vision, and natural language processing. In this thesis, we therefore develop deep-learning techniques to overcome the shortcomings of inner-product-based tensor completion. The main contributions are outlined as follows:

First, to address the limitations of existing tensor completion solutions, we propose a novel Neural Tensor Completion (NTC) scheme. NTC models the third-order interaction among data features with the outer product, building a 3D interaction map, and then applies 3D convolution on that map to learn high-order interaction features from the local range to the global range. We show theoretically that this design leads to good learning ability. Extensive experiments on two real-world network monitoring datasets, Abilene and WS-DREAM, demonstrate that NTC significantly reduces the error in missing-data recovery. At a low sampling ratio of 1%, the recovery error ratios on the testing data are
around 0.05 (Abilene) and 0.13 (WS-DREAM) with NTC, versus 0.99 (Abilene) and 0.99 (WS-DREAM) for the best current tensor completion algorithms, i.e., 21 times and 8 times larger, respectively.

Second, we propose a Fusion Neural Tensor Completion (FuNTC) model to address the limitation that NTC extracts only the nonlinear, complex structural information among the underlying feature dimensions. Within a neural-network framework, the NTC and tensor-factorization components share the same feature embedding, so nonlinear and linear feature information are extracted simultaneously and efficiently, yielding higher-precision data recovery. FuNTC thus inherits the advantages of NTC while compensating for its inability to capture higher-order linear correlations. Extensive experiments on the two real-world datasets, Abilene and WS-DREAM, show significant improvements of FuNTC over NTC: at a 1% sampling ratio, the recovery error ratios of the best current tensor completion algorithms are 24 times (Abilene) and 8 times (WS-DREAM) larger than those of FuNTC, respectively.
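The contrast drawn above between inner-product tensor completion and NTC's outer-product interaction map can be illustrated with a small numeric sketch. This is not the thesis's implementation; the embedding dimension and random vectors are purely illustrative, and the 3D CNN that NTC would run over the map is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension (illustrative)

# One embedding vector per tensor mode (e.g. origin, destination, time)
u, v, w = rng.standard_normal(d), rng.standard_normal(d), rng.standard_normal(d)

# Classic tensor completion (CP-style): the prediction is the generalized
# inner product -- a single scalar, so only multiplicative, linear-in-each-
# factor interactions survive.
inner = np.sum(u * v * w)

# NTC instead keeps the full outer product: a d x d x d interaction map
# that a 3D CNN can scan for high-order, nonlinear patterns across
# different feature dimensions.
interaction_map = np.einsum('i,j,k->ijk', u, v, w)
assert interaction_map.shape == (d, d, d)

# The inner product is just the main-diagonal sum of that map, so the
# interaction map strictly generalizes the inner-product model.
assert np.isclose(np.einsum('iii->', interaction_map), inner)
```

Because the inner product collapses the map to its diagonal, any off-diagonal cross-dimension structure is invisible to CP-style models; this is the information the 3D convolutions in NTC are designed to exploit.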
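The fusion idea behind FuNTC, a linear tensor-factorization branch and a nonlinear neural branch reading the same shared embedding, can likewise be sketched. All layer sizes, the ReLU hidden layer, and the fusion weight `alpha` are assumptions for illustration; in FuNTC the branches and embeddings would be trained jointly rather than drawn at random.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4  # shared embedding dimension (illustrative)
u, v, w = rng.standard_normal(d), rng.standard_normal(d), rng.standard_normal(d)

# Linear branch: a tensor-factorization (inner-product) score computed
# from the shared embeddings.
linear_score = np.sum(u * v * w)

# Nonlinear branch: a tiny one-hidden-layer network on the SAME embeddings
# (weights are random here; FuNTC would learn them end to end).
x = np.concatenate([u, v, w])                 # shared feature embedding
W1 = rng.standard_normal((8, 3 * d))
W2 = rng.standard_normal(8)
nonlinear_score = W2 @ np.maximum(W1 @ x, 0.0)  # ReLU hidden layer

# Fusion: because both branches consume the same embedding, linear and
# nonlinear correlations are extracted simultaneously.
alpha = 0.5  # hypothetical fusion weight
prediction = alpha * linear_score + (1 - alpha) * nonlinear_score
assert np.isfinite(prediction)
```

Sharing one embedding between the two branches is the key design choice: it lets the nonlinear branch compensate for structure the inner product misses while the linear branch preserves the higher-order linear correlations that a purely convolutional model tends to lose.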
Keywords/Search Tags:Tensor completion, Sparse network monitoring, Neural network