In this paper, we consider the stochastic collocation method via non-convex compressive sensing, mainly ℓq minimization and transformed ℓ1 (TL1 for short) minimization, and their applications to partial differential equations with random inputs. The main results of this paper fall into two parts. 1) Using the norm inequality between ℓq and ℓ2 and the square-root lifting inequality, we present several new theoretical estimates of the recoverability of both sparse and non-sparse signals via ℓq minimization. In addition, based on the work of Candès, we establish new results on the accuracy of TL1 reconstruction from underdetermined measurements, which improve on the earlier estimates derived by Zhang and Xin and take a more compact form. 2) We then combine these methods with stochastic collocation to identify the coefficients of sparse orthogonal polynomial expansions arising in the field of uncertainty quantification, and we obtain recoverability results for both sparse polynomial functions and general non-sparse functions. We also present various numerical experiments to show the performance of the ℓq and TL1 algorithms. In each part, we first present benchmark tests demonstrating the ability of ℓq and TL1 minimization to recover exactly sparse signals, then use orthonormal polynomial expansions to approximate classical analytical functions, showing the advantage of this method over other optimization methods (for example, standard ℓ1, reweighted ℓ1, and ℓ1-2 minimization). Finally, we consider both partial differential equations and ordinary differential equations with random inputs and compare the approximation error of our quantity of interest (QoI) in the numerical experiments. All the numerical results indicate that the ℓq method performs better than standard ℓ1 and reweighted ℓ1 minimization, and that the DCA-TL1 method likewise outperforms standard ℓ1 and ℓ1-2 minimization.
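To make the recovery setting concrete: ℓq minimization seeks a sparse solution of an underdetermined system Ax = b by minimizing the non-convex ℓq quasi-norm (0 < q < 1). The sketch below illustrates one common way to approximate such a problem, iteratively reweighted least squares (IRLS) with an annealed smoothing parameter; it is not the algorithm analyzed in the paper, and the function name `irls_lq`, the smoothing schedule, and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

def irls_lq(A, b, q=0.5, iters=50, eps=1.0):
    """Illustrative IRLS sketch for min ||x||_q^q subject to Ax = b (noiseless).

    Each iterate solves a weighted least-norm problem whose closed form is
    x = W^{-1} A^T (A W^{-1} A^T)^{-1} b with W = diag((x_i^2 + eps)^{q/2 - 1}).
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # minimum-norm starting point
    for _ in range(iters):
        w_inv = (x**2 + eps) ** (1 - q / 2)    # inverse IRLS weights
        G = (A * w_inv) @ A.T                  # A W^{-1} A^T
        x = w_inv * (A.T @ np.linalg.solve(G, b))
        eps = max(eps * 0.5, 1e-12)            # anneal the smoothing parameter
    return x

# Small synthetic test: recover a 4-sparse vector from 24 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, s = 64, 24, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
b = A @ x0
x_hat = irls_lq(A, b, q=0.5)
```

Each IRLS iterate interpolates the data exactly (A x = b holds by construction), while the shrinking weights push small entries toward zero, mimicking the sparsity-promoting effect of the ℓq penalty. The paper's DCA-TL1 algorithm plays an analogous role for the transformed ℓ1 penalty.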