
Empirical Likelihood Inference For Two Regression Models With Missing Data

Posted on: 2007-08-23    Degree: Master    Type: Thesis
Country: China    Candidate: H F Qi    Full Text: PDF
GTID: 2120360212973261    Subject: Probability theory and mathematical statistics
Abstract/Summary:
In many scientific areas a basic task is to assess the simultaneous influence of several factors (covariates) on a quantity of interest (the response variable). Regression models provide a powerful framework for this, and the associated parametric and nonparametric inference theories are well established. Let $X$ and $T$ be covariate vectors and let $Y$ be a response variable, with dimensions $p$, $1$ and $1$ respectively, and let $\delta$ denote a missingness indicator. In practice we often obtain two kinds of random samples of incomplete data:

(1) $(X_i, Y_i, \delta_{X_i}, \delta_{Y_i})$, $i = 1, \dots, n$, where both $X_i$ and $Y_i$ may be missing: $\delta_{X_i} = 0$ if $X_i$ is missing and $\delta_{X_i} = 1$ otherwise; $\delta_{Y_i} = 0$ if $Y_i$ is missing and $\delta_{Y_i} = 1$ otherwise. Throughout, we assume that $X$ and $Y$ are missing completely at random (MCAR), that is, $P(\delta_X = 1 \mid X, Y) = P(\delta_X = 1) = p_1$ and $P(\delta_Y = 1 \mid X, Y) = P(\delta_Y = 1) = p_2$.

(2) $(Y_i, \delta_i, X_i, T_i)$, $i = 1, \dots, n$, where $X_i$ and $T_i$ are always observed, and $\delta_i = 0$ if $Y_i$ is missing, $\delta_i = 1$ otherwise. Throughout, we assume that $Y_i$ is missing at random (MAR), that is, $P(\delta = 1 \mid Y, X, T) = P(\delta = 1 \mid X, T)$.

In these cases the usual inference procedures cannot be applied directly. A common way to handle incomplete data is to impute a value for each missing variable and then apply standard methods to the completed data set as if the imputed values were true observations. The imputation method used here is linear regression imputation.

Under sample (1) we consider the linear model $Y = X'\beta + \nu_0(X)\varepsilon$. We construct an estimator of the mean $\theta_0$ of the response variable $Y$ and obtain an adjusted empirical log-likelihood ratio $\hat{l}_{ad}(\theta_0)$ (Theorem 1.1, where $\theta_0$ is the true parameter), which is asymptotically standard chi-squared. Under sample (2) we consider the partly linear model $Y_i = X_i'\beta + g(T_i) + e_i$, $i = 1, \dots, n$, and obtain an adjusted empirical log-likelihood ratio for the parameter $\beta$, which is also asymptotically standard chi-squared. The main results are as follows.

Theorem 1.1. If $E\|X\|^2 < \infty$ and $\theta_0$ is the true parameter, then $\hat{l}_{ad}(\theta_0)$ converges in distribution to the standard chi-squared distribution with one degree of freedom.

To obtain the asymptotic distribution in the partly linear model, we estimate the nonparametric component $g(\cdot)$ by nonparametric kernel regression, with kernel $K(\cdot)$ and bandwidth $h = h_n$. We make the following assumptions for the results below:
(i) the density of $T_1$, say $r(t)$, exists and satisfies a regularity condition;
(ii) the functions $m_{1j}(t) = E(X_{1j} \mid T_1 = t)$, $1 \le j \le p$, exist on $[0, 1]$, and $g(\cdot)$ and $m_{1j}(\cdot)$, $1 \le j \le p$, satisfy the Lipschitz condition of order 1;
(iii) with $\|\cdot\|$ denoting the Euclidean norm, a certain second-moment matrix is positive definite;
(iv) $Ee_1^4 < \infty$;
(v) there exist constants $M_1, M_2 > 0$ and $\rho > 0$ such that $K(\cdot)$ is a bounded-variation kernel function on $[-\rho, \rho]$;
(vi) $nh^2 / \log n \to \infty$ and $nh^4 \to 0$;
(vii) certain matrices involving $P(X, T) = E(\delta \mid X, T)$ are positive definite;
(viii) a further regularity condition holds.

Theorem 2.1. Under conditions (i)-(viii) and the null hypothesis $H_0: \beta = \beta_0$, the adjusted empirical log-likelihood ratio for $\beta$ converges in distribution to a standard chi-squared distribution.
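To make the first setting concrete, the following is a minimal sketch, not the thesis's adjusted statistic: it simulates data with $Y$ missing completely at random, performs linear regression imputation from the complete cases, and computes Owen's standard empirical log-likelihood ratio for the mean $\theta_0 = EY$ on the imputed sample. The simulated data, the function name el_log_ratio, and all numerical settings are illustrative assumptions rather than the thesis's construction.

# Sketch only: regression imputation under MCAR plus the standard (unadjusted)
# empirical log-likelihood ratio for the mean of Y. All names and settings are
# illustrative assumptions, not the thesis's adjusted statistic.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)

# Simulate (X_i, Y_i) with Y missing completely at random (delta_Y = 1 means observed).
n, p = 200, 2
X = rng.normal(size=(n, p))
beta = np.array([1.0, -0.5])
Y = X @ beta + rng.normal(scale=0.5, size=n)
delta_Y = rng.binomial(1, 0.8, size=n).astype(bool)

# Linear regression imputation: fit beta on complete cases, impute the missing Y.
beta_hat, *_ = np.linalg.lstsq(X[delta_Y], Y[delta_Y], rcond=None)
Y_imp = np.where(delta_Y, Y, X @ beta_hat)

def el_log_ratio(z):
    """-2 log empirical likelihood ratio for E(Z) = 0, given centered data z
    whose convex hull contains 0 (i.e. z has both positive and negative values)."""
    lo = -1.0 / z.max() + 1e-10          # lower end of the admissible Lagrange-multiplier range
    hi = -1.0 / z.min() - 1e-10          # upper end
    lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))

theta0 = 0.0                              # hypothesised mean of Y (true mean here, since E(X) = 0)
l_hat = el_log_ratio(Y_imp - theta0)
print(l_hat)

The ratio above treats imputed values as genuine observations; the thesis's contribution is an adjusted version of such a ratio whose limiting distribution is standard chi-squared, so the adjusted statistic, not this plain one, is what Theorem 1.1 calibrates against chi-squared quantiles.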
Keywords/Search Tags: missing data, linear model, partly linear model, empirical likelihood