
Low-rank Tensor Representation Learning Methods And Applications

Posted on: 2022-04-22
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y N Qiu
Full Text: PDF
GTID: 1488306317994149
Subject: Control Science and Engineering
Abstract/Summary:
With the development of sampling and internet technology, high-order data arise in many fields of scientific research and in real-world applications. How to explore the inherent characteristics and essential structure of such data for learning tasks has become a hot research topic in machine learning, data mining, statistical analysis, and pattern recognition. This thesis focuses on low-rank tensor representation learning and covers two aspects: for unsupervised representation learning, we study low-rank tensor representations for the tensor completion problem; for tensor data with limited supervisory information, we study semi-supervised low-rank tensor representation learning for semi-supervised clustering and classification. The main contributions of this thesis are as follows:

1. We establish a noisy tensor completion method and analyze its non-asymptotic upper bound. Existing low-rank tensor completion methods often assume that the observed data are noise-free, which prevents their further extension to practical applications. We propose a noisy tensor completion method based on a least squares estimator and a low-rank tensor ring model to recover tensor data from partial and noisy observations. To characterize the statistical performance of the proposed method, we analyze its non-asymptotic upper bound and further prove that this bound is near-optimal in the minimax sense. To solve the optimization problem, we propose two efficient algorithms, one of which is tailored to large-scale tensors by equivalently replacing the tensor ring nuclear norm minimization on the original tensor with that on a much smaller tensor within a heterogeneous tensor decomposition framework. Experimental results on both synthetic and real-world data demonstrate the effectiveness and efficiency of the proposed method in recovering noisy incomplete tensor data, compared with state-of-the-art tensor completion methods.

2. To fully explore the low-rank structure in the spectral domain, we propose low-tubal-rank based high-order tensor completion methods. The definition of tensor tubal rank is limited to third-order tensors and is sensitive to the choice of the third mode. To alleviate this bottleneck, we develop a tensor unfolding scheme for high-order tensors that avoids the sensitivity problem and extends to tensors of arbitrary order. Based on this unfolding scheme, we further propose two tensor tubal nuclear norms, namely the overlapped tubal nuclear norm and the latent tubal nuclear norm, which exploit the low-rank structure of a high-order tensor in all unfolding schemes and in partial unfolding schemes, respectively. In addition, we analyze the deterministic upper bound on the estimation error of the two tubal nuclear norm based completion methods. Experimental results show that both proposed tensor completion methods achieve better completion performance than state-of-the-art low-rank tensor completion methods.
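For reference, the following is a minimal sketch (not code from the thesis) of the standard third-order tubal nuclear norm that the overlapped and latent tubal nuclear norms generalize: the tensor is Fourier-transformed along its third mode and the nuclear norms of the resulting frontal slices are summed. The function name and the 1/n3 normalization are illustrative conventions, not the thesis's implementation.

    import numpy as np

    def tubal_nuclear_norm(X):
        """Tubal nuclear norm of a third-order tensor X of shape (n1, n2, n3):
        FFT along the third (tube) mode, then sum the nuclear norms of the
        frontal slices in the Fourier domain. The 1/n3 scaling follows one
        common convention; some definitions omit it."""
        n3 = X.shape[2]
        Xf = np.fft.fft(X, axis=2)  # spectral domain along mode 3
        slice_norms = [
            np.linalg.svd(Xf[:, :, k], compute_uv=False).sum()  # nuclear norm of slice k
            for k in range(n3)
        ]
        return sum(slice_norms) / n3

Roughly speaking, an overlapped variant applies such a norm to every unfolding of the same higher-order tensor, while a latent variant splits the tensor into a sum of components and penalizes each component in a single unfolding.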
3. To extract low-rank representations from tensor data with limited label information, we propose a non-negative Tucker decomposition method based on label propagation. Most existing non-negative Tucker decomposition methods are unsupervised, so the label information attached to tensors cannot be utilized effectively. We observe that the low-dimensional representation extracted by non-negative Tucker decomposition can be treated as a predicted soft-clustering coefficient matrix and can therefore be learned jointly with label propagation in a unified framework. An efficient accelerated proximal gradient algorithm is developed to solve the resulting optimization problem. Experimental results on five benchmark image data sets for semi-supervised clustering and classification tasks demonstrate the superiority of this method over state-of-the-art methods.

4. To learn low-rank tensor representations with limited pairwise constraints (must-link and cannot-link), we propose a generalized graph regularized non-negative tensor decomposition framework. Specifically, when pairwise constraints are not available, we develop an unsupervised graph regularized non-negative Tucker decomposition method that constructs a nearest-neighbor graph to preserve the intrinsic manifold structure of the tensor data. When limited pairwise constraints are available, unlike most existing semi-supervised learning methods that only use the pre-given supervisory information, we propagate the constraints through the whole data set and build a semi-supervised graph Laplacian matrix, from which we formulate a semi-supervised graph regularized non-negative Tucker decomposition method. To solve the optimization problem, we develop a fast and efficient alternating proximal gradient algorithm and show its convergence and correctness. Experimental results on unsupervised and semi-supervised clustering tasks with four image data sets demonstrate the effectiveness and high efficiency of the proposed methods.
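To make the graph construction behind the last two contributions concrete, here is a minimal sketch (assumed, not taken from the thesis) of a k-nearest-neighbor heat-kernel graph and its unnormalized Laplacian L = D - W, together with a naive way of injecting must-link and cannot-link pairs by editing affinities directly; the thesis instead propagates the constraints through the whole data set before forming the semi-supervised Laplacian. All function names and parameters are illustrative assumptions.

    import numpy as np

    def knn_graph_laplacian(samples, k=5, sigma=1.0):
        """Build a symmetric k-nearest-neighbor affinity matrix W with a
        Gaussian heat kernel over samples of shape (n, d), and return W and
        the unnormalized graph Laplacian L = D - W used as a manifold
        regularizer."""
        n = samples.shape[0]
        d2 = np.sum((samples[:, None, :] - samples[None, :, :]) ** 2, axis=-1)
        W = np.zeros((n, n))
        for i in range(n):
            nbrs = np.argsort(d2[i])[1:k + 1]  # k nearest neighbors, excluding i
            W[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma ** 2))
        W = np.maximum(W, W.T)  # symmetrize
        return W, np.diag(W.sum(axis=1)) - W

    def inject_constraints(W, must_link=(), cannot_link=()):
        """Naive constraint injection (a simplification of constraint
        propagation): must-link pairs get maximal affinity, cannot-link
        pairs get zero affinity."""
        W = W.copy()
        for i, j in must_link:
            W[i, j] = W[j, i] = 1.0
        for i, j in cannot_link:
            W[i, j] = W[j, i] = 0.0
        return W

In graph regularized factorization methods, such a Laplacian typically enters the objective as a trace penalty of the form tr(A^T L A) on the coefficient matrix A, which encourages nearby samples to share similar low-dimensional representations.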
Keywords/Search Tags:tensor decomposition, low-rank tensor completion, non-negative tensor decomposition, graph and semi-supervised learning, tensor nuclear norm