This paper concerns folded concave penalized sparse linear regression (FCPSLR). Although specific algorithms exist for solving sparse linear regression under folded concave penalties, it remains unclear whether an approximate solution satisfying certain conditions is close enough to the true parameter, and whether such a guarantee can be established independently of the particular algorithm. To address these questions, this paper presents the following results: (1) Under some conditions on the parameters of the folded concave penalties, any local solution (stationary point) is a sparse estimator. (2) Perhaps more importantly, any local solution satisfying a significant subspace second-order necessary condition (S3ONC), which is weaker than the second-order KKT condition, yields a bounded error in approximating the true parameter with high probability; in addition, if the minimal signal strength is sufficient, the S3ONC solution likely recovers the oracle solution. (3) Applying (2) to the special case of FCPSLR with the minimax concave penalty (MCP), we show that under the restricted eigenvalue condition, any S3ONC solution with a better objective value than the Lasso solution entails the strong oracle property. (4) This work also considers augmented Lagrangian methods that converge to second-order stationary points and in which any constraint can be penalized or carried over to the subproblems.
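For context, the following is a minimal sketch of the FCPSLR formulation with the MCP referenced in (3), assuming the standard setting of a design matrix $X \in \mathbb{R}^{n \times p}$, a response vector $y \in \mathbb{R}^{n}$, and the commonly used parameterization of the minimax concave penalty with $\lambda > 0$ and $a > 1$; the paper's exact notation and scaling may differ:
\[
\min_{\beta \in \mathbb{R}^{p}} \; \frac{1}{2n}\,\|y - X\beta\|_{2}^{2} \;+\; \sum_{j=1}^{p} P_{\lambda}\!\left(|\beta_{j}|\right),
\qquad
P_{\lambda}(t) \;=\;
\begin{cases}
\lambda t - \dfrac{t^{2}}{2a}, & 0 \le t \le a\lambda,\\[6pt]
\dfrac{a\lambda^{2}}{2}, & t > a\lambda.
\end{cases}
\]
Folded concave penalties such as the MCP behave like the $\ell_{1}$ penalty near the origin but flatten to a constant for large coefficients, which is what permits sparse local solutions while mitigating the estimation bias incurred by the Lasso.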