
The Low Rank Approximation Of Matrix And Its Application

Posted on: 2018-07-30    Degree: Master    Type: Thesis
Country: China    Candidate: M K Xu    Full Text: PDF
GTID: 2310330536988344    Subject: Applied Mathematics
Abstract/Summary:
Low-rank matrix approximation is a sparse representation that approximates the original matrix by a matrix of lower rank. It not only retains the main features of the original matrix but also reduces data storage and computational complexity. Based on this, this thesis studies how to reduce the rank of a matrix, discusses algorithms for rank-reduction optimization, and applies them to face recognition. The main work and contributions cover three aspects, summarized as follows:

A. To avoid the large-scale singular value decompositions required by existing nuclear-norm-based low-rank matrix recovery models, a low-rank matrix recovery model based on non-negative matrix factorization (NMF) is proposed. Applying NMF to the low-rank component allows the low-rank factorization to be computed quickly and avoids large-scale singular value decomposition. The model is solved with the alternating direction method of multipliers (ADMM). Experimental results on the ORL, AL_Gore and Windows databases show that the NMF-based low-rank recovery model achieves a higher recognition rate and better rank reduction than traditional low-rank recovery models.

B. Since Laplacian Eigenmaps preserves only local neighborhood information and cannot handle new (out-of-sample) data, a face recognition method combining Laplacian Eigenmaps with 2D-KPCA (2D-KPCA+LE) is proposed. First, 2DPCA is applied to the training sample matrices, which retains the structural information of the sample space and yields a low-rank feature matrix. KPCA then maps the low-dimensional features into a high-dimensional feature space to extract nonlinear properties. Because the kernel matrix requires substantial storage, Laplacian Eigenmaps is applied to reduce the dimensionality again. Experimental results on the ORL and FERET face databases show that 2D-KPCA+LE achieves a higher recognition rate and lower algorithmic complexity than other manifold learning methods.

C. To overcome the high time complexity of singular value decomposition, a quantum phase estimation algorithm for finding singular values is presented, exploiting quantum parallelism. The matrix to be decomposed is first transformed into a Hermitian matrix, from which a unitary matrix carrying the same eigenvalue information is generated by a unitary transformation. Quantum phase estimation then yields an approximate phase for each eigenvalue. With a first register of m qubits, the phase can be determined to n-bit accuracy (n ≤ m) with probability greater than s/n^2. Numerical experiments show that the algorithm has lower time complexity and a higher success probability than classical algorithms.
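For the NMF-based recovery model of Part A, the following is a minimal illustrative sketch, not the thesis's exact formulation: it factors the low-rank component as a non-negative product W H and separates a sparse error term E by soft-thresholding, using simple alternating updates in place of the full ADMM scheme. The function names, the rank, and the regularization parameter lam are assumptions chosen for illustration.

    import numpy as np

    def soft_threshold(X, tau):
        """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    def nmf_lowrank_recovery(D, rank, lam=0.1, n_iter=200, eps=1e-12):
        """Approximate D as a non-negative low-rank product W @ H plus a sparse error E."""
        m, n = D.shape
        rng = np.random.default_rng(0)
        W = np.abs(rng.standard_normal((m, rank)))
        H = np.abs(rng.standard_normal((rank, n)))
        E = np.zeros_like(D)
        for _ in range(n_iter):
            R = np.maximum(D - E, 0.0)              # non-negative target for the NMF step
            H *= (W.T @ R) / (W.T @ W @ H + eps)    # multiplicative NMF updates (Lee-Seung style)
            W *= (R @ H.T) / (W @ H @ H.T + eps)
            E = soft_threshold(D - W @ H, lam)      # sparse error absorbs what the factors miss
        return W @ H, E

Replacing the SVD of a large matrix by updates on the small factors W and H is what lets this kind of model sidestep the nuclear-norm bottleneck described in Part A.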
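For Part B, the sketch below covers two stages of the pipeline, assuming standard 2DPCA and Laplacian Eigenmaps formulations; the KPCA stage and the final classifier are omitted, and all names and parameters (k, knn, sigma) are illustrative choices rather than the thesis's settings.

    import numpy as np

    def two_d_pca(images, k):
        """2DPCA: project each image matrix onto the top-k eigenvectors of the
        image covariance matrix, preserving row-wise structural information."""
        mean = np.mean(images, axis=0)
        G = sum((A - mean).T @ (A - mean) for A in images) / len(images)
        _, vecs = np.linalg.eigh(G)            # eigenvalues in ascending order
        P = vecs[:, -k:]                       # top-k projection axes
        return np.stack([A @ P for A in images]), P

    def laplacian_eigenmaps(X, dim, knn=5, sigma=1.0):
        """Laplacian Eigenmaps on flattened feature vectors X (n_samples x d)."""
        n = X.shape[0]
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        W = np.zeros((n, n))
        for i in range(n):
            idx = np.argsort(d2[i])[1:knn + 1]                 # k nearest neighbours
            W[i, idx] = np.exp(-d2[i, idx] / (2 * sigma ** 2)) # heat-kernel weights
        W = np.maximum(W, W.T)                                  # symmetrise the graph
        D = np.diag(W.sum(axis=1))
        L = D - W
        # generalised eigenproblem L y = lambda D y; drop the trivial constant eigenvector
        vals, vecs = np.linalg.eig(np.linalg.solve(D, L))
        order = np.argsort(vals.real)
        return vecs[:, order[1:dim + 1]].real

The 2DPCA stage keeps the image's matrix structure instead of flattening it first, which is the property Part B relies on before the nonlinear (KPCA) and manifold (LE) steps.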
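For Part C, the sketch below does not simulate the quantum circuit; it only verifies classically (with NumPy/SciPy) the construction the abstract describes: embedding the matrix in a Hermitian matrix and forming a unitary whose eigenphases carry the singular values, which is the quantity quantum phase estimation would read out. The example matrix is an arbitrary illustration, scaled so every singular value stays below pi and no phase wraps around.

    import numpy as np
    from scipy.linalg import expm

    def hermitian_embedding(A):
        """Embed a (possibly non-square) matrix A into a Hermitian matrix whose
        nonzero eigenvalues are plus/minus the singular values of A."""
        m, n = A.shape
        H = np.zeros((m + n, m + n), dtype=complex)
        H[:m, m:] = A
        H[m:, :m] = A.conj().T
        return H

    A = np.array([[0.3, 0.1],
                  [0.1, 0.3],
                  [0.0, 0.2]])           # illustrative data, singular values < pi
    H = hermitian_embedding(A)
    U = expm(1j * H)                     # unitary whose eigenphases are +/- sigma (and 0)
    phases = np.angle(np.linalg.eigvals(U))
    print(np.sort(np.abs(phases)))       # zeros plus each singular value twice
    print(np.linalg.svd(A, compute_uv=False))

On a quantum computer, phase estimation applied to this U with an m-qubit register is what recovers the phases (and hence the singular values) to the stated accuracy.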
Keywords/Search Tags:Low-rank matrix recovery, Manifold learning, Singular value decomposition, Quantum phase estimation, Low-rank approximation