
Difference-based K-d Estimator Of Parameters In A Partial Linear Model

Posted on: 2017-01-27
Degree: Master
Type: Thesis
Country: China
Candidate: F F Huang
Full Text: PDF
GTID: 2180330485962368
Subject: Statistics

Abstract/Summary:
The partial linear regression model was first proposed by Engle et al. (1986) to analyze the relationship between temperature and electricity demand, and it has since become a very important statistical model. Consider the partial linear regression model

    y = Xβ + f + ε,

where y and X are observed, β is an unknown parameter vector, f is an unknown smooth function, and ε is a vector of independent and identically distributed (i.i.d.) random errors.

The difference approach eliminates the trend in the data that arises from the nonparametric component. Because this method does not require an estimator of the nonparametric part, the resulting estimators are called difference-based estimators. After differencing, the partial linear regression model is transformed into a linear regression model, whose design matrix may exhibit multicollinearity. To combat this multicollinearity, the primary aim of this thesis is to introduce a difference-based k-d estimator of the parameters.

In Chapter 2, after defining the difference-based k-d estimator of the parameters in a partial linear model, we establish some properties of the estimator, and we compare the difference-based k-d estimator with the difference-based ridge estimator under the mean squared error (MSE) and mean squared error matrix (MMSE) criteria. In Chapter 3, we investigate the selection of the biasing parameters. In Chapter 4, two numerical examples are given to illustrate the theoretical results.
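The abstract does not give the explicit form of the k-d estimator, so the following Python snippet is only a minimal sketch of the difference-based approach it builds on: it simulates a partial linear model with a collinear design, removes the smooth trend by first-order differencing, and then fits the difference-based ridge estimator that the thesis uses as a benchmark. The differencing weights, the biasing parameter k = 0.1, and all variable names are illustrative assumptions, not the thesis's actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3

# Simulate a partial linear model: y = X @ beta + f(t) + eps
t = np.sort(rng.uniform(0.0, 1.0, n))
X = rng.normal(size=(n, p))
# Make the first two columns nearly collinear (illustrative)
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)
beta = np.array([1.0, 2.0, -1.5])
f = np.sin(2 * np.pi * t)                  # unknown smooth function
y = X @ beta + f + 0.5 * rng.normal(size=n)

# First-order differencing with weights (1, -1)/sqrt(2); because f is
# smooth and t is sorted, differencing largely cancels the trend f(t).
d = np.array([1.0, -1.0]) / np.sqrt(2.0)
Dy = d[0] * y[1:] + d[1] * y[:-1]
DX = d[0] * X[1:] + d[1] * X[:-1]

# Difference-based OLS vs. difference-based ridge on the differenced data
G = DX.T @ DX
k = 0.1                                     # illustrative biasing parameter
beta_ols = np.linalg.solve(G, DX.T @ Dy)
beta_ridge = np.linalg.solve(G + k * np.eye(p), DX.T @ Dy)
```

With strong collinearity the individual coefficients of the collinear pair are poorly identified, but their sum (here about 3.0) and the coefficient of the independent regressor are recovered well after differencing; the ridge term k·I stabilizes the near-singular matrix G.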
Keywords/Search Tags: Difference-based k-d estimator, Partial linear model, Mean squared error, Mean squared error matrix