In this paper, we consider variable selection and model estimation in partial linear single index models under the assumption that the vector of regression coefficients is sparse. We apply penalized splines to estimate the nonparametric function and the SCAD penalty to achieve sparsity in both the linear part and the single index part of the model. Under mild conditions, we show that the penalized estimators have the oracle property, in the sense that they are asymptotically normal with the same mean and covariance matrix that they would have if the zero coefficients were known in advance. Our model admits a least squares representation, so standard software can be used and no extra programming effort is needed. Moreover, parametric estimation, variable selection, and nonparametric estimation are realized in a single step, which greatly improves computational stability. The finite sample performance of the penalized estimators is evaluated through Monte Carlo studies and illustrated with a real data set.
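As a minimal illustration (not the authors' implementation), the SCAD penalty referred to above is the piecewise function of Fan and Li (2001), with the conventional tuning value a = 3.7; a direct sketch of it in Python:

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty p_lam(theta) of Fan and Li (2001).

    Piecewise: linear (like the lasso) near zero, quadratic blend on
    (lam, a*lam], and constant beyond a*lam so that large coefficients
    are not over-shrunk. a = 3.7 is the conventional choice.
    """
    t = np.abs(theta)
    return np.where(
        t <= lam,
        lam * t,                                          # lasso-like region
        np.where(
            t <= a * lam,
            (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),  # quadratic blend
            (a + 1) * lam**2 / 2,                          # flat tail: no extra shrinkage
        ),
    )
```

The flat tail beyond a·lam is what yields the unbiasedness of large estimated coefficients and hence the oracle property discussed in the abstract.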