High-dimensional linear models have been applied in many research areas, e.g., signal compression, compressive sensing, and high-dimensional statistics. In this dissertation we propose several algorithms for high-dimensional linear regression under sparsity constraints, conduct convergence analysis for the proposed algorithms, and validate them with comprehensive numerical examples.

First, we propose a pathwise semismooth Newton algorithm (PSNA) for sparse recovery in high-dimensional linear models. PSNA is derived from a formulation of the KKT conditions and can be applied to the Lasso problem or the elastic net (Enet) regularized problem. It solves the KKT equations efficiently by continuously tracking the support of the regression coefficients along the solution path with warm starts. At each knot on the path, PSNA converges locally superlinearly for the Enet criterion and achieves the best possible convergence rate for the Lasso criterion, i.e., it converges in just one step, at the cost of two matrix-vector multiplications per iteration. Under certain regularity conditions on the design matrix and on the minimum magnitude of the nonzero elements of the target regression coefficients, we show that, with high probability in the case of random Gaussian noise, PSNA reaches a solution with the same signs as the target regression coefficients and achieves a sharp estimation error bound in finitely many steps. Extensive simulation studies support our theoretical results and indicate that PSNA is competitive with or outperforms state-of-the-art Lasso solvers in terms of efficiency and accuracy.

We then propose a subspace hard thresholding pursuit (SHTP) algorithm and a more stable variant, SHTP*, for group sparse signal recovery; both are able to accommodate strong correlation within groups. SHTP can be regarded as a generalization of the hard thresholding pursuit algorithm to group sparse models. It involves two stages: (1) finding a sparse representation with respect to the given dictionary, and (2) finding the coefficients locally. We prove that when the noise level is sufficiently small, SHTP converges in no more than T steps, where T is the number of nonzero groups. SHTP* combines SHTP with a greedy strategy of the kind used in matching-pursuit-type algorithms. Numerical experiments show that SHTP* can find the correct sparse representation with high probability in cases where SHTP may fail. We prove that SHTP and SHTP* are immune to correlation within groups; they are therefore suitable for real-world applications, e.g., microarray gene analysis. Furthermore, we prove that a Gaussian random matrix satisfies the convergence condition with high probability.
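
To make the pathwise KKT idea concrete, the sketch below shows a simplified primal-dual active-set iteration of the semismooth-Newton type for the Lasso, run along a decreasing lambda path with warm starts. It is an illustrative sketch only, not the dissertation's PSNA implementation; the function name `kkt_lasso_path`, the step tolerances, and the assumption that the reduced Gram matrix on the active set is nonsingular are all choices made here for illustration.

```python
import numpy as np

def kkt_lasso_path(X, y, lambdas, max_inner=50, tol=1e-10):
    """Illustrative primal-dual active-set (semismooth-Newton-type) Lasso
    solver along a decreasing lambda path with warm starts.
    Objective: (1/(2n)) * ||y - X beta||^2 + lam * ||beta||_1."""
    n, p = X.shape
    beta = np.zeros(p)                 # primal variable, warm-started across lambdas
    d = X.T @ y / n                    # dual variable (1/n) X^T (y - X beta)
    path = []
    for lam in sorted(lambdas, reverse=True):
        for _ in range(max_inner):
            # Active set read off the KKT fixed point beta = S_lam(beta + d)
            z = beta + d
            active = np.abs(z) > lam
            beta = np.zeros(p)
            if active.any():
                XA = X[:, active]
                sgn = np.sign(z[active])
                G = XA.T @ XA / n              # reduced Gram matrix (assumed nonsingular)
                rhs = XA.T @ y / n - lam * sgn
                beta[active] = np.linalg.solve(G, rhs)
            d = X.T @ (y - X @ beta) / n
            # KKT residual: beta should be a fixed point of soft-thresholding
            kkt = beta - np.sign(beta + d) * np.maximum(np.abs(beta + d) - lam, 0.0)
            if np.max(np.abs(kkt)) < tol:
                break
        path.append((lam, beta.copy()))
    return path
```

A typical call would pass a geometrically decreasing grid of lambda values, e.g. `kkt_lasso_path(X, y, np.geomspace(lam_max, lam_max / 100, 50))`, so that each knot is warm-started from the previous solution.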
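For the group sparse setting, the following sketch shows a plain group-level hard thresholding pursuit iteration (gradient step, group-wise thresholding, least-squares debiasing). It is only meant to illustrate the HTP skeleton that SHTP generalizes; the subspace/dictionary representation stage of SHTP and the greedy strategy of SHTP* are omitted, and the function name `group_htp`, the step size, and the iteration limits are assumptions of this sketch.

```python
import numpy as np

def group_htp(X, y, groups, T, max_iter=100, step=1.0, tol=1e-8):
    """Illustrative group-level hard thresholding pursuit.
    `groups` is a list of index arrays partitioning the columns of X,
    and T is the assumed number of nonzero groups."""
    n, p = X.shape
    x = np.zeros(p)
    for _ in range(max_iter):
        # Gradient step on the least-squares objective
        u = x + step * X.T @ (y - X @ x)
        # Keep the T groups with the largest energy in u
        energy = np.array([np.linalg.norm(u[g]) for g in groups])
        keep = np.argsort(energy)[-T:]
        support = np.concatenate([groups[k] for k in keep])
        # Debias: least squares restricted to the selected groups
        sol, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        x_new = np.zeros(p)
        x_new[support] = sol
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```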