
Improved Low Rank Tensor Completion Algorithm And Applications

Posted on: 2017-05-19    Degree: Master    Type: Thesis
Country: China    Candidate: H M Liu    Full Text: PDF
GTID: 2308330503970007    Subject: Mathematics

Abstract/Summary:
In recent years, the acquisition of massive high-dimensional data has become easier with the rapid development of modern network technology, computer communications and sampling technology. Missing values are very common in the acquisition process, and low rank tensor completion aims to recover all missing elements by exploiting the low rank property of the investigated data. Although matrix completion algorithms can recover missing elements, representing high-dimensional data as a matrix can cause the curse of dimensionality, over-fitting and even the destruction of the data structure when the analyzed data has a very complex structure. Therefore, as the higher-order generalization of the matrix, the tensor can properly describe high-dimensional data structure. Tensor completion generally uses the low n-rank minimization framework, but this requires multiple large-scale singular value decompositions, which causes high computational complexity. Moreover, the n-rank of a tensor is generally set by experience, which affects the completion performance. The main work is as follows:

(1) We briefly introduce the models, classic solving algorithms and applications of compressed sensing, low rank matrix reconstruction and low rank tensor completion. To better understand and solve the low rank tensor completion problem, the fundamental knowledge of vectors, matrices and tensors is provided.

(2) This dissertation summarizes the existing algorithms for low rank tensor completion. Domestic research on low rank tensor completion is still in its infancy. We briefly review the existing mainstream algorithms, evaluating the pros and cons of each; these algorithms basically apply the rank minimization framework to solve the low rank tensor completion problem.

(3) An improved low rank tensor completion algorithm is proposed. First, we adopt the classic Tucker approximation as the completion model in the presence of Gaussian noise.
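The n-rank mentioned above is the tuple of ranks of the tensor's mode-n unfoldings. As a minimal illustration of that definition (not code from the thesis; `unfold` and `n_rank` are illustrative names), consider:

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: arrange the mode-n fibers as columns of a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def n_rank(tensor):
    """The n-rank is the tuple of matrix ranks of all mode-n unfoldings."""
    return tuple(np.linalg.matrix_rank(unfold(tensor, m))
                 for m in range(tensor.ndim))

# A rank-1 tensor (outer product of three vectors) has n-rank (1, 1, 1).
a, b, c = np.random.rand(4), np.random.rand(5), np.random.rand(6)
T = np.einsum('i,j,k->ijk', a, b, c)
print(n_rank(T))  # (1, 1, 1)
```

The low n-rank minimization framework relaxes each of these matrix ranks to a nuclear norm, which is why it needs one large singular value decomposition per mode per iteration.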
Then, the singular value decomposition is replaced by a QR decomposition during the iterative update. Finally, the performance of the improved low rank tensor completion algorithm is validated on a variety of data sets: compared with the classic fast low rank tensor completion algorithm and the high accuracy low rank tensor completion algorithm, the proposed algorithm performs better in both relative square error and running time.

(4) This dissertation discusses low-rank CP tensor decomposition from a fully Bayesian viewpoint. Existing low rank tensor completion algorithms generally adopt the low n-rank minimization framework, whose significant flaw is that the n-mode rank is usually set by experience rather than selected optimally through self-learning, which partly affects the completion results. The fully Bayesian CP decomposition algorithm solves the low rank tensor completion problem with Bayesian probability theory. Its advantage is that the rank of the tensor is no longer preset by experience but obtained at its optimal value through self-learning.
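The abstract does not spell out the exact QR-based update, but the general idea of avoiding a full SVD can be sketched with a randomized range finder: orthonormalize a sketch of the column space by QR, then project. The function name `low_rank_qr` and the Gaussian-sketch construction are illustrative assumptions, not the thesis's specific scheme:

```python
import numpy as np

def low_rank_qr(A, r, oversample=5, seed=0):
    """Rank-r approximation of A via a randomized QR range finder
    instead of a full singular value decomposition (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Sample the column space of A with a Gaussian test matrix.
    Y = A @ rng.standard_normal((A.shape[1], r + oversample))
    Q, _ = np.linalg.qr(Y)   # orthonormal basis for the sampled range
    return Q @ (Q.T @ A)     # project A onto that basis: A ~= Q Q^T A

# Sanity check on an exactly rank-8 matrix: the approximation is essentially exact.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 8)) @ rng.standard_normal((8, 80))
A_hat = low_rank_qr(A, r=8)
rel_err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
```

The QR factorization here costs a tall-skinny decomposition of a sketch with only r + 5 columns, which is much cheaper than a full SVD of A when r is small.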
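The fully Bayesian CP approach learns the rank automatically; a standard (non-Bayesian) CP decomposition, by contrast, must be handed the rank R up front. A minimal alternating least squares sketch makes that preset explicit (`cp_als`, `unfold` and `khatri_rao` are illustrative names, not the thesis's code):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding with row-major column ordering."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(T, R, iters=200, seed=0):
    """CP decomposition of a 3-way tensor by alternating least squares.
    The rank R must be chosen in advance -- exactly the preset that the
    fully Bayesian formulation replaces with self-learning."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((T.shape[0], R))
    B = rng.standard_normal((T.shape[1], R))
    C = rng.standard_normal((T.shape[2], R))
    for _ in range(iters):
        # Each step is an exact least-squares solve, so the fit never worsens.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
    return A, B, C, T_hat

# Fit a rank-2 tensor built from known factors.
rng = np.random.default_rng(2)
T = np.einsum('ir,jr,kr->ijk', rng.standard_normal((4, 2)),
              rng.standard_normal((5, 2)), rng.standard_normal((6, 2)))
_, _, _, T_hat = cp_als(T, R=2)
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
```

If R is guessed too small the fit is poor, and if too large the factors over-fit noise; the Bayesian CP model sidesteps this by inferring the effective rank from the data.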
Keywords/Search Tags:Compressed Sensing, Matrix Completion, Low Rank Tensor Completion, QR Decomposition, Improved Low Rank Tensor Completion, Low Rank Tensor Full Bayesian CP Decomposition