
Graph Regularized Nonnegative Tensor Train Decomposition Algorithm And Its Application In Feature Extraction

Posted on: 2022-05-04
Degree: Master
Type: Thesis
Country: China
Candidate: Z X Wu
GTID: 2518306539961969
Subject: Control Engineering

Abstract/Summary:
With the rapid growth of big data, many real-world data sets exhibit both high volume and high dimensionality. Because such data carry specific physical meaning, they are usually nonnegative and are collectively referred to as nonnegative tensor data. Real nonnegative tensor data are often high-dimensional, large-scale, and heterogeneous, and their valuable information is hidden in a complex latent structure. As the scale of nonnegative tensor data keeps growing, the storage and computational cost of data analysis and mining grows accordingly. Moreover, manifold learning suggests that observed data are generated by mapping an intrinsic manifold structure into a high-dimensional space, yet existing classical nonnegative tensor decomposition techniques have difficulty capturing this intrinsic manifold structure when mining the latent information of the data. Therefore, how to process nonnegative tensor data quickly and effectively, capture their intrinsic manifold structure, and extract more representative low-dimensional features is an important research topic in the field of feature extraction.

In recent years, Nonnegative Tensor Train decomposition (NTT) has attracted much attention because it can alleviate the curse of dimensionality to some extent while retaining the high-dimensional structural information of nonnegative tensor data. This thesis studies the NTT model together with the intrinsic manifold structure of the data encoded by a nearest-neighbor graph. A nonnegative tensor train decomposition model that captures the manifold structure of the data is established, two optimization methods are designed to solve it, and the effectiveness and correctness of the theory are verified by experiments.

To address the problem that the traditional NTT model cannot capture the intrinsic manifold structure of the data, this thesis describes that structure by constructing a nearest-neighbor graph. On top of the NTT decomposition, a graph regularization term built from the neighbor graph is embedded, yielding the Graph Regularized Nonnegative Tensor Train decomposition (GNTT) model. The GNTT model retains the advantages of the NTT model while also capturing the manifold structure of nonnegative tensor data, and is therefore able to learn more discriminative information.

For the optimization of the GNTT model, this thesis defines its cost function, derives the update rules based on the Multiplicative Updating method (MU) in detail, and thus proposes the GNTT-MU algorithm. The GNTT-MU algorithm is simple in principle and easy to implement. Experimental results show that the features extracted by GNTT-MU represent the original data well: on three nonnegative tensor data sets, its quantitative clustering and classification indicators are ahead of algorithms of the same type. The experiments also verify that GNTT-MU is insensitive to hyperparameters, is simple and practical, and converges to a stationary point. However, its optimization efficiency is low on large-scale problems.
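As a concrete illustration of the graph-regularized multiplicative updates described above, the following minimal NumPy sketch builds a k-nearest-neighbor graph and applies MU rules to the analogous graph-regularized nonnegative matrix subproblem; the thesis derives the corresponding rules for the nonnegative TT cores. The function names (`knn_graph`, `graph_regularized_mu`) and parameters (`k`, `lam`) are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def knn_graph(X, k=5):
    """Symmetric k-nearest-neighbour affinity W and degree matrix D.

    X : (n_features, n_samples) array; the graph is built over the columns (samples).
    """
    n = X.shape[1]
    sq = np.sum(X ** 2, axis=0)
    dist = sq[:, None] + sq[None, :] - 2.0 * (X.T @ X)   # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(dist[i])[1:k + 1]               # k nearest neighbours, skipping the point itself
        W[i, idx] = 1.0                                  # 0/1 weights; heat-kernel weights are also common
    W = np.maximum(W, W.T)                               # symmetrise the affinity matrix
    D = np.diag(W.sum(axis=1))
    return W, D

def graph_regularized_mu(X, r, W, D, lam=0.1, n_iter=300, eps=1e-9):
    """Multiplicative updates for  min_{U,V >= 0} ||X - U V^T||_F^2 + lam * tr(V^T (D - W) V)."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    U = rng.random((m, r))
    V = rng.random((n, r))
    for _ in range(n_iter):
        U *= (X @ V) / (U @ (V.T @ V) + eps)                                     # data-fit update
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)   # graph-regularized update
    return U, V

# toy usage: 30 samples with 50 nonnegative features, 4 latent features
X = np.random.default_rng(1).random((50, 30))
W, D = knn_graph(X, k=5)
U, V = graph_regularized_mu(X, r=4, W=W, D=D, lam=0.1)
```

The key property of MU-style rules is that, starting from nonnegative factors, every update multiplies by a ratio of nonnegative quantities, so nonnegativity is preserved automatically without an explicit projection.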
To address the low efficiency of the GNTT-MU algorithm, this thesis proves that the derivative of the GNTT cost function with respect to the target variable is Lipschitz continuous, derives the update formula based on the Accelerated Proximal Gradient method (APG), and thus proposes the efficient GNTT-APG algorithm. Experimental results show that on five nonnegative tensor data sets the clustering and classification performance of GNTT-APG is close to that of GNTT-MU, while its running time on the same task is shorter. The experiments also verify that GNTT-APG is insensitive to hyperparameters and converges quickly, which further demonstrates its practicality. An additional experiment on high-order nonnegative tensor data sets shows that GNTT-APG remains efficient as the tensor order increases.
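The accelerated variant can be illustrated on the same graph-regularized subproblem. The sketch below uses the Lipschitz constant of the gradient to fix the step size (the thesis proves Lipschitz continuity for the GNTT cost function) and performs FISTA-style extrapolation with projection onto the nonnegative orthant as the proximal step. The function name `apg_nonneg_factor` and the fixed step size are assumptions for illustration, not the thesis's exact GNTT-APG updates for the TT cores; `W` and `D` are the graph matrices from the previous sketch.

```python
import numpy as np

def apg_nonneg_factor(X, U, W, D, lam=0.1, n_iter=200):
    """FISTA-style accelerated proximal gradient for the graph-regularised subproblem
       min_{V >= 0} ||X - U V^T||_F^2 + lam * tr(V^T L V),  with L = D - W."""
    L_graph = D - W
    n, r = X.shape[1], U.shape[1]
    UtU, XtU = U.T @ U, X.T @ U
    # Lipschitz constant of the gradient -> fixed step size 1 / Lc
    Lc = 2.0 * (np.linalg.norm(UtU, 2) + lam * np.linalg.norm(L_graph, 2))
    V = np.zeros((n, r))      # current iterate V_k
    Y = V.copy()              # extrapolated point y_k
    t = 1.0
    for _ in range(n_iter):
        grad = 2.0 * (Y @ UtU - XtU + lam * (L_graph @ Y))
        V_new = np.maximum(Y - grad / Lc, 0.0)             # proximal step = projection onto V >= 0
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        Y = V_new + ((t - 1.0) / t_new) * (V_new - V)      # Nesterov extrapolation
        V, t = V_new, t_new
    return V
```

Compared with the MU sketch, each iteration costs roughly the same, but the extrapolation step typically yields a much faster decrease of the cost, which mirrors the reported speed advantage of GNTT-APG over GNTT-MU.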
Keywords/Search Tags: Nonnegative tensor train decomposition, Graph regularization, Feature extraction, Multiplicative updating method, Accelerated proximal gradient method