
Linear Model Selection By Cross-validation

Posted on: 2007-11-20    Degree: Master    Type: Thesis
Country: China    Candidate: B Wen    Full Text: PDF
GTID: 2120360212465513    Subject: Probability theory and mathematical statistics
Abstract/Summary:
In the field of model (or variable) selection for the multiple regression model, many criteria have been proposed to address this problem, for example the C_p criterion, the Akaike information criterion (Akaike, 1974, AIC), the Bayesian information criterion (BIC), the φ criterion, and the cross-validation criterion. All of these criteria use a fixed choice of the penalty function. A fixed choice may be good in some situations but perform poorly in others, so the choice of the penalty function affects the performance of a model selection criterion. Hence there is a need for a data-oriented penalty, so that a procedure using it performs well across situations. The first attempt to provide a data-oriented penalty function was made by Rao and Wu (1989) and was applied to model selection problems in the multiple regression model. In this paper, our object is to pursue the investigation started in Rao and Wu (1989) and make some refinements. We consider the problem of model selection in the classical regression model based on cross-validation with an added penalty term for penalizing overfitting. Under some weak conditions, the new criterion is shown to be strongly consistent in the sense that it selects the true model with probability one for all sufficiently large n. As shown in our Monte Carlo simulations, the criterion with a data-oriented penalty provides improved performance over the criterion with a fixed choice of the penalty function, and it also works well for small sample sizes.
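To make the general idea concrete, the following is a minimal sketch (in Python, using only NumPy) of model selection in linear regression by k-fold cross-validation over candidate variable subsets, with an additive penalty on model size to discourage overfitting. This is not the thesis's exact procedure: the penalty weight lam below is a user-supplied constant standing in for a data-oriented penalty, and the functions cv_error and select_model are illustrative names introduced here.

# Minimal sketch: cross-validation-based variable selection for linear regression,
# with an assumed constant penalty weight `lam` in place of a data-oriented penalty.
import itertools
import numpy as np

def cv_error(X, y, k=5, seed=0):
    """Average squared prediction error of OLS under k-fold cross-validation."""
    n = len(y)
    idx = np.random.default_rng(seed).permutation(n)
    err = 0.0
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        # Fit OLS on the training folds, evaluate on the held-out fold.
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        err += np.sum((y[fold] - X[fold] @ beta) ** 2)
    return err / n

def select_model(X, y, lam=0.0, k=5):
    """Return the non-empty column subset minimizing CV error + lam * subset size."""
    p = X.shape[1]
    best_subset, best_score = None, np.inf
    for size in range(1, p + 1):
        for subset in itertools.combinations(range(p), size):
            score = cv_error(X[:, subset], y, k=k) + lam * size
            if score < best_score:
                best_subset, best_score = subset, score
    return best_subset, best_score

if __name__ == "__main__":
    # Toy data: only the first two of five predictors matter.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 5))
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(100)
    subset, score = select_model(X, y, lam=0.05)
    print("selected variables:", subset, "score:", round(score, 3))

In a data-oriented version of this idea, lam would instead be chosen from the data rather than fixed in advance, which is the refinement of Rao and Wu (1989) that the thesis pursues.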
Keywords/Search Tags: AIC, BIC, GIC, Consistency, Cross-validation, Linear regression, Model selection, Variable selection, Sub-Gaussian distribution, Monte Carlo