
Exact Reconstruction Based On Prior Knowledge Of The Unknown Signal

Posted on: 2012-06-24    Degree: Master    Type: Thesis
Country: China    Candidate: Y P Liu    Full Text: PDF
GTID: 2218330362451036    Subject: Basic mathematics
Abstract/Summary:
Conventional signal processing must follow the Nyquist sampling theorem: the sampling rate cannot be less than twice the highest frequency in the spectrum of the analog signal. For sparse or compressible signals, compressed sensing breaks this limitation. Suppose the unknown vector x ∈ R^n has a k-sparse representation θ in some orthonormal basis or tight frame, and let y = Φθ ∈ R^m, with m << n, be the measurement. Compressed sensing theory states that θ can be reconstructed exactly from y provided Φ satisfies the restricted isometry property.

Now consider the corrupted measurement y = Ax + e ∈ R^m, where A is an m × n matrix of full column rank and e ∈ R^m is an arbitrary, unknown error vector whose support satisfies |{i : e_i ≠ 0}| ≤ ρm for some ρ > 0. Using the spark of the matrix, this thesis proves that x can be reconstructed exactly provided the fraction of errors satisfies ρ ≤ ρ*, where ρ* depends on the ratio between the dimension of the measurement and that of the unknown signal. The thesis carries out extensive numerical experiments to verify this conclusion, and all the statistics agree with the theory. In practical applications, x is recovered exactly by solving a simple ℓ1-minimization problem, and the thesis presents a new method for this optimization: compute the subgradient of the objective function; the convergence point of the corresponding neural network is the optimal solution.

The thesis also studies the Gelfand n-widths of ℓ_p balls in high-dimensional Euclidean space for 0 < p ≤ 1. The Gelfand and Kolmogorov n-widths give bounds on the absolute error of signal recovery, which allows a quantitative assessment of the performance of the information operator.
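The spark of a matrix, the smallest number of its columns that are linearly dependent, underlies the exact-recovery argument: a standard fact is that every k-sparse vector is the unique sparsest solution of Ax = y exactly when spark(A) > 2k. As an illustration only (a brute-force computation, not the thesis's proof technique), the spark of a small matrix can be found as follows:

```python
import numpy as np
from itertools import combinations

def spark(A, tol=1e-10):
    """Smallest number of columns of A that are linearly dependent.

    Brute force over column subsets, so only feasible for small n.
    Returns n + 1 if every subset of columns is independent.
    """
    m, n = A.shape
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            # A subset of k columns is dependent iff its rank is below k.
            if np.linalg.matrix_rank(A[:, cols], tol=tol) < k:
                return k
    return n + 1

# Columns e1, e2, e1 + e2: any two are independent, all three dependent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
print(spark(A))  # 3
```

With spark 3, this matrix guarantees unique recovery only for k-sparse vectors with 2k < 3, i.e. k ≤ 1.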
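Recovering x from the grossly corrupted measurement y = Ax + e by ℓ1-minimization amounts to solving min_x ||y − Ax||_1, which can be cast as a linear program. The sketch below is a minimal illustration under my own assumptions (the dimensions, corruption level, and the use of scipy.optimize.linprog are not taken from the thesis):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 80, 20                              # m >> n: overdetermined coding matrix
A = rng.standard_normal((m, n))            # Gaussian matrix, full column rank a.s.
x_true = rng.standard_normal(n)

e = np.zeros(m)                            # sparse gross corruption: 10% of entries
bad = rng.choice(m, size=8, replace=False)
e[bad] = 10.0 * rng.standard_normal(8)
y = A @ x_true + e

# min_{x,t} sum(t)  subject to  -t <= y - A x <= t,  i.e. ||y - Ax||_1.
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[ A, -np.eye(m)],         #  A x - t <=  y
                 [-A, -np.eye(m)]])        # -A x - t <= -y
b_ub = np.concatenate([y, -y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + m))
x_hat = res.x[:n]

print(np.max(np.abs(x_hat - x_true)))      # essentially zero: exact recovery
```

Because the fraction of corrupted entries (10%) is well below the recovery threshold for a Gaussian matrix with m/n = 4, the decoder recovers x exactly up to solver precision.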
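The thesis's solver follows the subgradient of the objective, with the convergence point of an associated neural network giving the optimal solution. The sketch below is only a plain subgradient-descent analogue of that idea, not the thesis's neural-network construction; it uses the fact that a subgradient of f(x) = ||y − Ax||_1 is g = −A^T sign(y − Ax):

```python
import numpy as np

def l1_decode_subgradient(A, y, steps=5000, lr=0.01):
    """Minimize f(x) = ||y - A x||_1 by projected-free subgradient descent.

    Illustrative only: step sizes and iteration count are arbitrary choices,
    and plain subgradient descent converges slowly compared with an LP solver.
    """
    x = np.zeros(A.shape[1])
    best, f_best = x.copy(), np.inf
    for t in range(1, steps + 1):
        r = y - A @ x
        f = np.abs(r).sum()
        if f < f_best:                     # keep the best iterate seen so far
            best, f_best = x.copy(), f
        g = -A.T @ np.sign(r)              # a subgradient of the objective
        x = x - (lr / np.sqrt(t)) * g      # diminishing step size
    return best
```

Starting from x = 0, the returned iterate is guaranteed to have an ℓ1 residual no larger than ||y||_1, and in practice it approaches the ℓ1-minimizer as the iteration count grows.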
Keywords/Search Tags: Compressive sensing, sparse representation, l1-minimization, linear programming, restricted isometry property, Gaussian random matrix