
Study On Estimators In Linear And Generalized Linear Models

Posted on: 2015-09-01
Degree: Doctor
Type: Dissertation
Country: China
Candidate: J W Huang
Full Text: PDF
GTID: 1220330422971466
Subject: Computational Mathematics
Abstract/Summary:
Linear models and generalized linear models are two important classes of statistical models. Many phenomena in biology, medicine, economics, management, geology, meteorology, agriculture, industry, and other fields can be described by a linear or generalized linear model. Parameter estimation is one of the most important and difficult problems in these models. In this dissertation, we mainly study biased estimators of the coefficient vector in linear and generalized linear models.

For the linear model with autocorrelated or heteroscedastic errors, we generalize a principal component two-parameter estimator for the regression coefficient vector. We then give detailed comparisons, under the mean squared error matrix criterion, between the estimators that can be derived from the principal component two-parameter estimator, such as the generalized least squares estimator, the principal components estimator, the r-k estimator, the r-d estimator, and the two-parameter estimator. We also investigate the performance of these estimators with respect to the mean squared error criterion. Finally, we give test statistics to verify whether the conditions for the superiority of one estimator over another are indeed satisfied.

For the classical Gauss-Markov model, we give detailed comparisons, under the average loss criterion based on the generalized Mahalanobis loss function, between the principal component two-parameter estimator, the principal components estimator, the r-k estimator, the r-d estimator, the two-parameter estimator, and the ordinary least squares estimator, and we obtain conditions for the superiority of one estimator over another.
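The shrinkage estimators compared above have widely used textbook forms. As a minimal numerical sketch, the following assumes the common definitions of the ridge, Liu, and two-parameter estimators (not the dissertation's generalized principal-component versions) on illustrative synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear model y = X b + e with two nearly collinear columns
n, p = 100, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=n)   # induce multicollinearity
b_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ b_true + rng.normal(scale=0.5, size=n)

XtX, Xty, I = X.T @ X, X.T @ y, np.eye(p)
b_ols = np.linalg.solve(XtX, Xty)                    # ordinary least squares

k, d = 0.5, 0.5                                      # illustrative shrinkage values
b_ridge = np.linalg.solve(XtX + k * I, Xty)          # ridge estimator
b_liu = np.linalg.solve(XtX + I, Xty + d * b_ols)    # Liu estimator
# Two-parameter estimator combining both shrinkage parameters:
# it reduces to ridge at d = 0, to Liu at k = 1, and to OLS at k = 0
b_two = np.linalg.solve(XtX + k * I, Xty + k * d * b_ols)
```

The r-k and r-d estimators named above additionally restrict attention to the leading principal components of X'X before applying ridge-type or Liu-type shrinkage.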
Considering the average loss criterion and the mean squared error criterion simultaneously, we propose optimal values for the ridge parameter k of the r-k estimator and the Liu parameter d of the r-d estimator, and we suggest choosing the r-k estimator, the r-d estimator, or the principal components estimator as the optimal alternative to the ordinary least squares estimator under different conditions.

In Chapter 4, we study the superiority of two linear estimators under Pitman's closeness criterion based on the Mahalanobis loss function and give detailed comparisons among the ridge estimator, the Liu estimator, and the ordinary least squares estimator. The performance of the r-k estimator, the r-d estimator, and the principal components estimator under Pitman's closeness criterion based on the generalized Mahalanobis loss function is compared in similar detail. We also investigate the superiority of the r-k estimator, the r-d estimator, and the principal components estimator over the ordinary least squares estimator under Pitman's closeness criterion based on the generalized Mahalanobis loss function.

To combat multicollinearity in the generalized linear model, we propose a two-parameter estimator as an alternative to the maximum likelihood estimator. We show that there exist k > 0 and 0 < d < 1 such that the mean squared error of the two-parameter estimator is smaller than that of the maximum likelihood estimator. We derive properties of the two-parameter estimator under the mean squared error criterion and the mean squared error matrix criterion, and we obtain necessary and sufficient conditions for the superiority of the two-parameter estimator, the ridge estimator, and the Liu estimator over the maximum likelihood estimator. We also derive sufficient conditions for the superiority of the two-parameter estimator over the ridge estimator and the Liu estimator in the mean squared error matrix sense.
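For the generalized linear model, shrinkage estimators of this kind are typically built on the maximum likelihood fit. The sketch below assumes one common form, β(k, d) = (X'WX + kI)⁻¹(X'WX + kdI)β_ML, for logistic regression; the data, parameter values, and this exact form are illustrative assumptions rather than the dissertation's construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative logistic-regression data with two nearly collinear columns
n, p = 200, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 0.05 * rng.normal(size=n)
b_true = np.array([0.8, -0.5, 0.3])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ b_true))))

# Maximum likelihood fit by Newton-Raphson (equivalently, IRLS)
b_ml = np.zeros(p)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-(X @ b_ml)))
    W = mu * (1.0 - mu)                        # GLM working weights
    b_ml = b_ml + np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - mu))

mu = 1.0 / (1.0 + np.exp(-(X @ b_ml)))
XtWX = X.T @ (X * (mu * (1.0 - mu))[:, None])  # X' W X at the ML fit

# Two-parameter estimator shrinking the ML estimate (k > 0, 0 < d < 1)
k, d, I = 0.5, 0.5, np.eye(p)
b_two = np.linalg.solve(XtWX + k * I, (XtWX + k * d * I) @ b_ml)
```

Because the shrinkage matrix has eigenvalues (λ + kd)/(λ + k) < 1 when 0 < d < 1, the resulting estimate always has a smaller norm than the maximum likelihood fit.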
Furthermore, several methods and three rules for choosing appropriate shrinkage parameters are proposed.
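Pitman's closeness criterion used in the comparisons above declares one estimator closer than another when it attains the smaller loss with probability greater than one half. The probability can be estimated by simulation; the following sketch compares ridge and ordinary least squares under the Mahalanobis-type loss (β̂ − β)'X'X(β̂ − β), with illustrative data and parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative collinear design; compare ridge vs OLS by Pitman closeness
n, p, k, reps = 50, 3, 1.0, 2000
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 0] + 0.05 * rng.normal(size=n)
b_true = np.array([1.0, -1.0, 0.5])
XtX, I = X.T @ X, np.eye(p)

def loss(b_hat):
    # Mahalanobis-type loss (b_hat - b)' X'X (b_hat - b)
    diff = b_hat - b_true
    return diff @ XtX @ diff

wins = 0
for _ in range(reps):
    y = X @ b_true + rng.normal(size=n)
    b_ols = np.linalg.solve(XtX, X.T @ y)
    b_ridge = np.linalg.solve(XtX + k * I, X.T @ y)
    wins += int(loss(b_ridge) < loss(b_ols))

# Estimated Pitman probability; values above 0.5 favor the ridge estimator
pitman_prob = wins / reps
```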
Keywords/Search Tags: Linear models, Generalized linear models, Biased estimator, Mean squared error matrix criterion, Pitman's closeness criterion