
Estimations And Tests Of Heteroscedastic Time Series Models Based On Empirical Likelihood Method

Posted on: 2019-04-10  Degree: Doctor  Type: Dissertation
Country: China  Candidate: C X Peng  Full Text: PDF
GTID: 1360330572452963  Subject: Probability theory and mathematical statistics
Abstract/Summary:
Time series are families of random variables that evolve with time and are widely used in practice. According to whether the observations are integer-valued, time series models can be divided into two categories: integer-valued models and non-integer-valued models. Conditional heteroscedasticity is a common phenomenon in both cases, so conditional heteroscedastic time series models can likewise be divided into non-integer-valued and integer-valued conditional heteroscedastic models. In this thesis, based on the empirical likelihood method, we investigate parameter estimation and hypothesis testing for the autoregressive conditional heteroscedastic model and the threshold autoregressive conditional heteroscedastic model. Our main results are as follows.

First, we employ the empirical likelihood method to estimate the unknown parameters of a Poisson autoregressive model in the presence of auxiliary information. Specifically, consider the Poisson autoregressive model

$X_t \mid \mathcal{F}_{t-1} \sim \mathrm{Poisson}(\lambda_t)$, $\lambda_t = \alpha_0 + \sum_{i=1}^{p} \alpha_i X_{t-i}$,  (1)

where the $\sigma$-field $\mathcal{F}_{t-1} = \sigma(X_{t-1}, X_{t-2}, \ldots)$, $\alpha_0 > 0$, $\alpha_i \ge 0$ ($i = 1, 2, \ldots, p$), and $\alpha = (\alpha_0, \alpha_1, \ldots, \alpha_p)^\top$ is the unknown parameter vector. We assume that we have auxiliary information that can be represented as the conditional moment restrictions $E\big(g(X_t, \ldots, X_{t-p}; \theta_0) \mid X^{(t-1)}\big) = 0$, $t = 0, 1, 2, \ldots$, where the unknown parameter vector $\theta_0 \in \mathbb{R}^d$, $X^{(t-1)} = (X_{t-1}, \ldots, X_{t-p})$, and $g(x; \theta) \in \mathbb{R}^r$ is some function with $r \ge d$. By using the auxiliary information, we can obtain data-adaptive weights for a weighted least squares estimate based on the empirical likelihood method. Before stating the main results, we make the following assumptions.

Assumption 1.1. The parameter space $T$ is compact, with $T = \{\alpha : \eta \le \alpha_0 \le M,\ 0 < \alpha_1 + \cdots + \alpha_p \le M^* < 1,\ \alpha_i \ge 0,\ i = 1, 2, \ldots, p\}$, where $\eta$ and $M$ are finite positive constants, and the true parameter value $\alpha_0$ is an interior point of $T$.

Assumption 1.2. There exists $\theta_0$ such that $E(g_t(\theta_0)) = 0$; the matrix $\Sigma(\theta) = E(g_t(\theta) g_t^\top(\theta))$ is positive definite at $\theta_0$; $\partial g(x;\theta)/\partial\theta$ is continuous in a neighborhood of the true value $\theta_0$; $\|\partial g(x;\theta)/\partial\theta\|$ and $\|g(x;\theta)\|^3$ are bounded by some integrable function $W(x)$ in this neighborhood; and the rank of $E(\partial g_t(\theta_0)/\partial\theta)$ is $d$.

Note that the conditional moment restrictions imply $E(g_t(\theta_0)) = 0$. Combining this with the empirical likelihood method, let

$L(\theta) = \sup\Big\{ \prod_{t=1}^{n} n\omega_t : \omega_t \ge 0,\ \sum_{t=1}^{n} \omega_t = 1,\ \sum_{t=1}^{n} \omega_t g_t(\theta) = 0 \Big\}$,

where $g_t(\theta) = g(X_t, \ldots, X_{t-p}; \theta)$ and $\theta_0$ is the unknown parameter. By using the auxiliary information, we obtain data-adaptive weights $\omega_t$; combining these with the least squares method yields the weighted least squares estimate $\hat{\alpha} = \arg\min_{\alpha} \sum_{t=1}^{n} \omega_t (X_t - Z_t^\top \alpha)^2$, where $Z_t = (1, X_{t-1}, \ldots, X_{t-p})^\top$. By introducing a Lagrange multiplier $\lambda \in \mathbb{R}^r$, standard derivations in empirical likelihood lead to

$\omega_t(\theta_0) = \dfrac{1}{n} \cdot \dfrac{1}{1 + \lambda_{\theta_0}^\top g_t(\theta_0)}$,  (2)

where $\lambda_{\theta_0}$ satisfies $\frac{1}{n} \sum_{t=1}^{n} \frac{g_t(\theta_0)}{1 + \lambda_{\theta_0}^\top g_t(\theta_0)} = 0$. Utilizing the weights (2), we have

$\hat{\alpha} = \arg\min_{\alpha} \sum_{t=1}^{n} \omega_t(\theta_0) (X_t - Z_t^\top \alpha)^2$.  (3)

In the following, we give the asymptotic properties of $\hat{\alpha}$.

Theorem 1. Assume that Assumptions 1.1 and 1.2 hold. If $\alpha_0$ is the true value of $\alpha$, then $\sqrt{n}(\hat{\alpha} - \alpha_0) \xrightarrow{d} N\big(0,\ W^{-1}(\Sigma - \Sigma_{12}\Sigma^{-1}(\theta_0)\Sigma_{12}^\top)W^{-1}\big)$, where $W = E(Z_t Z_t^\top)$, $\Sigma = E\big(Z_t Z_t^\top (X_t - Z_t^\top\alpha_0)^2\big)$ and $\Sigma_{12} = E\big(Z_t g_t^\top(\theta_0)(X_t - Z_t^\top\alpha_0)\big)$.

To apply the proposed estimator (3), we need to further estimate the unknown parameter $\theta$. Let $\hat{\theta} = \arg\max_{\theta} L(\theta)$. Following the results in Qin and Lawless (1994), the corresponding weights are $\omega_t(\hat{\theta}) = \frac{1}{n} \cdot \frac{1}{1 + \lambda_{\hat{\theta}}^\top g_t(\hat{\theta})}$, where $\lambda_{\hat{\theta}}$ is the solution to

$\dfrac{1}{n}\sum_{t=1}^{n}\dfrac{g_t(\theta)}{1+\lambda^\top g_t(\theta)}=0$ and $\dfrac{1}{n}\sum_{t=1}^{n}\dfrac{(\partial g_t(\theta)/\partial\theta)^\top\lambda}{1+\lambda^\top g_t(\theta)}=0$.

Let

$\hat{\alpha}_1 = \arg\min_{\alpha} \sum_{t=1}^{n} \omega_t(\hat{\theta}) (X_t - Z_t^\top \alpha)^2$.  (4)

In order to study the estimator (4), we define $\Gamma(\theta_0) = E(\partial g_t(\theta_0)/\partial\theta)$, $\Omega(\theta_0) = \big(\Gamma^\top(\theta_0)\Sigma^{-1}(\theta_0)\Gamma(\theta_0)\big)^{-1}$ and $B = \Sigma^{-1}(\theta_0)\big(I - \Gamma(\theta_0)\Omega(\theta_0)\Gamma^\top(\theta_0)\Sigma^{-1}(\theta_0)\big)$, where $I$ is the identity matrix. The limiting distribution of $\hat{\alpha}_1$ is given in the following theorem.

Theorem 2. Assume that Assumptions 1.1 and 1.2 hold. If $\alpha_0$ is the true value of $\alpha$, then $\sqrt{n}(\hat{\alpha}_1 - \alpha_0) \xrightarrow{d} N\big(0,\ W^{-1}(\Sigma - \Sigma_{12}B\Sigma_{12}^\top)W^{-1}\big)$.

Second, we consider testing for conditional heteroscedasticity in the $p$-order Poisson autoregressive model using two methods: a parametric test based on the maximum likelihood method and a nonparametric test based on the empirical likelihood method. Specifically, consider the $p$-order Poisson autoregressive model

$X_t \mid \mathcal{F}_{t-1} \sim \mathrm{Poisson}(\lambda_t)$, $\lambda_t = \alpha_0 + \sum_{i=1}^{p} \alpha_i X_{t-i}$,  (5)

where $\alpha_0 > 0$, $\alpha_i \ge 0$, $i = 1, 2, \ldots, p$, and the $\sigma$-field $\mathcal{F}_t = \sigma(X_{t-1}, X_{t-2}, \ldots)$. In what follows, we test for conditional heteroscedasticity of this model based on the recorded data $\{X_{1-p}, \ldots, X_0, X_1, \ldots, X_n\}$. For convenience, let $\alpha = (\alpha_0, \alpha_1, \ldots, \alpha_p)^\top$, $\beta = (0, 1, 1, \ldots, 1)^\top$ and $Z_t = (1, X_{t-1}, \ldots, X_{t-p})^\top$. Before stating our methods and main results, we make the following assumption.

Assumption 2.1. The parameter space $\Theta$ is compact, with $\Theta = \{\alpha = (\alpha_0, \alpha_1, \ldots, \alpha_p) : \eta \le \alpha_0 \le M,\ 0 \le \alpha_1 + \cdots + \alpha_p \le M^* < 1\}$, where $\eta$, $M$ and $M^*$ are finite positive constants, and the true parameter value $\alpha_0$ is an interior point of $\Theta$.

We first use the maximum likelihood method to construct the test statistic. In order to test the conditional heteroscedasticity of model (5), consider the hypothesis test

$H_0: \alpha \in \Theta_0$ vs. $H_1: \alpha \in \Theta \setminus \Theta_0$,  (6)

where $\Theta_0 = \{(\alpha_0, 0, \ldots, 0) : (\alpha_0, 0, \ldots, 0) \in \Theta\}$. Let $P_t(\alpha) = \lambda_t^{X_t} e^{-\lambda_t}/X_t!$, where $\lambda_t = \alpha_0 + \sum_{i=1}^{p} \alpha_i X_{t-i}$. Based on the observed data, the conditional log-likelihood function can be written as $l(\alpha) = \sum_{t=1}^{n} \log P_t(\alpha)$. If $\alpha \in \Theta$, the maximum likelihood estimate $\hat{\alpha}$ of $\alpha$ is the solution of the likelihood equation $\partial l(\alpha)/\partial\alpha = 0$. If the null hypothesis is true, that is, the parameter value $\alpha_0 \in \Theta_0$, then under $H_0$ the observations are independent and identically distributed Poisson with mean $\lambda_0$, and we write $\alpha = g(\lambda_0)$, so that $P_t(\alpha) = P_t(g(\lambda_0))$, which we further denote by $P_t(\lambda_0)$. Consequently, $P_t(\lambda_0) = \lambda_0^{X_t} e^{-\lambda_0}/X_t!$, and the maximum likelihood estimate $\hat{\lambda}_0$ is the solution of the likelihood equation $\partial l(\lambda_0)/\partial\lambda_0 = 0$. A simple calculation shows that $\hat{\lambda}_0 = \bar{X} = n^{-1}\sum_{t=1}^{n} X_t$. By the strong law of large numbers for independent and identically distributed random variables, $\hat{\lambda}_0 \to \lambda_0$ almost surely as $n \to \infty$. For the test problem (6), we define the likelihood ratio test statistic

$\Lambda = \dfrac{\sup_{\alpha \in \Theta} \prod_{t=1}^{n} P_t(\alpha)}{\sup_{\alpha \in \Theta_0} \prod_{t=1}^{n} P_t(\alpha)}$.

For the test statistic $\Lambda$, we have the following theorem.

Theorem 3. Assume that Assumption 2.1 holds. Then under $H_0$, for any $t > 0$, we have $\lim_{n\to\infty} P\{2\log\Lambda \le t\} = P\{\chi^2(1) \le t\}$, where $\chi^2(1)$ is a chi-squared distribution with one degree of freedom.

Next, using the empirical likelihood method, we construct a statistic to test the conditional heteroscedasticity of model (5). Let $\eta = \beta^\top\alpha = \alpha_1 + \cdots + \alpha_p$, and consider the hypothesis test $H_0: \eta = 0$ vs. $H_1: \eta > 0$. For this, we first consider the estimation of $\alpha$: minimizing $Q(\alpha) = \sum_{t=1}^{n}(X_t - Z_t^\top\alpha)^2$ with respect to $\alpha$ gives the conditional least squares estimator $\hat{\alpha}$; solving $\partial Q/\partial\alpha = 0$ yields $\hat{\alpha} = \big(\sum_{t=1}^{n} Z_t Z_t^\top\big)^{-1}\sum_{t=1}^{n} X_t Z_t$. The first component $\hat{\alpha}_0 = (1, 0, \ldots, 0)\hat{\alpha}$ is a consistent estimator of $\alpha_0$. Further, let $\alpha^* = (\hat{\alpha}_0, \alpha_1, \ldots, \alpha_p)^\top$. Then the estimating equation of $\eta$ can be written as $\frac{1}{n}\sum_{t=1}^{n} H_t(\eta) = 0$, where $H_t(\eta) = X_t Z_t^\top\big(\frac{1}{n}\sum_{s=1}^{n} Z_s Z_s^\top\big)^{-1}\beta - \eta$. According to Owen (1988), the empirical likelihood function can be constructed as

$L(\eta) = \sup\Big\{ \prod_{t=1}^{n} n p_t : p_t \ge 0,\ \sum_{t=1}^{n} p_t = 1,\ \sum_{t=1}^{n} p_t H_t(\eta) = 0 \Big\}$.

Using the standard Lagrange multiplier arguments, the optimal value of $p_t$ is found to be $p_t = \frac{1}{n}\cdot\frac{1}{1+\lambda(\eta)H_t(\eta)}$, where $\lambda(\eta)$ satisfies $\frac{1}{n}\sum_{t=1}^{n}\frac{H_t(\eta)}{1+\lambda(\eta)H_t(\eta)} = 0$. Therefore, ignoring the constant term $-n\log n$, the empirical log-likelihood function is defined as $l(\eta) = -\sum_{t=1}^{n}\log\big(1+\lambda(\eta)H_t(\eta)\big)$. In order to test $H_0$, we define the empirical likelihood ratio statistic

$T_n = -2\log\dfrac{L(0)}{\sup_{\eta \ge 0} L(\eta)}$.

For the test statistic $T_n$, we have the following theorem.

Theorem 4. Assume that Assumption 2.1 holds. Then under $H_0$, for any $t > 0$, we have $\lim_{n\to\infty} P\{T_n \le t\} = \frac{1}{2}P\{\chi^2(1) \le t\} + \frac{1}{2}$, where $\chi^2(1)$ is a chi-squared distribution with one degree of freedom.

Lastly, we consider the parameter estimation problem for the first-order threshold autoregressive conditional heteroscedastic model

$X_t = \phi_1 X_{t-1}^+ + \phi_2 X_{t-1}^- + \varepsilon_t$,  (7)

where $\varepsilon_t = \sqrt{h_t}\,e_t$, $h_t = \alpha_0 + \alpha_1(\varepsilon_{t-1}^+)^2 + \alpha_2(\varepsilon_{t-1}^-)^2$, $\{e_t\}$ is a sequence of independent identically distributed random variables with $Ee_t = 0$ and $\mathrm{Var}(e_t) = 1$; $\alpha_0$, $\alpha_1$ and $\alpha_2$ are model parameters with $\alpha_0 > 0$, $0 \le \alpha_j < 1$, $j = 1, 2$; and $X_t^+ = \max(X_t, 0)$, $X_t^- = \min(X_t, 0)$. In what follows, we use the empirical likelihood method to estimate the model parameters. Before we state our main results, the following assumptions will be made.

Assumption 3.1. The probability density function $f(\cdot)$ of $e_t$ has support $(-\infty, \infty)$ and satisfies $\phi_{\max} + \sqrt{\alpha_{\max}} < 1$, where $\phi_{\max} = \max\{|\phi_1|, |\phi_2|\}$ and $\alpha_{\max} = \max\{\alpha_1, \alpha_2\}$.

Assumption 3.2. $E(X_t^6) < \infty$.

Let $\phi = (\phi_1, \phi_2)^\top$, $\mathbf{X}_t = (X_{t-1}^+, X_{t-1}^-)^\top$ and $H_t(\phi) = (X_t - \mathbf{X}_t^\top\phi)\mathbf{X}_t$. By using the estimating equation of the least squares estimate, $\sum_{t=1}^{n}(X_t - \mathbf{X}_t^\top\phi)\mathbf{X}_t = 0$, we can obtain the profile empirical likelihood ratio function

$L(\phi) = \sup\Big\{ \prod_{t=1}^{n} n p_t : p_t \ge 0,\ \sum_{t=1}^{n} p_t = 1,\ \sum_{t=1}^{n} p_t H_t(\phi) = 0 \Big\}$.

Using the standard Lagrange multiplier arguments, the optimal value of $p_t$ is found to be $p_t = \frac{1}{n}\cdot\frac{1}{1+b^\top(\phi)H_t(\phi)}$, where $b(\phi)$ satisfies $\frac{1}{n}\sum_{t=1}^{n}\frac{H_t(\phi)}{1+b^\top(\phi)H_t(\phi)} = 0$. So we have $-2\log(L(\phi)) = 2\sum_{t=1}^{n}\log\big(1+b^\top(\phi)H_t(\phi)\big)$. Now we give the limiting distribution of $L(\phi)$.

Theorem 5. Assume that Assumptions 3.1 and 3.2 hold. Then, as $n \to \infty$, $-2\log(L(\phi)) \xrightarrow{d} \chi^2(1)$, where $\chi^2(1)$ is a chi-squared distribution with one degree of freedom.
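As a concrete illustration of the Poisson autoregressive model, the following Python sketch simulates an order-1 process and recovers the parameters with the (unweighted) conditional least squares estimate, solved as a 2x2 linear system. The parameter values and sample size are hypothetical, chosen only for demonstration; this is a minimal sketch, not the thesis's weighted procedure.

```python
import math
import random

random.seed(1)

def rpois(lam):
    # Knuth's product-of-uniforms Poisson sampler; fine for small lambda
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# hypothetical true values, for illustration only
a0, a1, n = 1.0, 0.4, 5000
X = [rpois(a0 / (1.0 - a1))]          # start near the stationary mean
for _ in range(n):
    X.append(rpois(a0 + a1 * X[-1]))  # X_t | F_{t-1} ~ Poisson(a0 + a1*X_{t-1})

# conditional least squares with Z_t = (1, X_{t-1})':
# alpha_hat = (sum Z_t Z_t')^{-1} sum X_t Z_t
S00 = S01 = S11 = b0 = b1 = 0.0
for t in range(1, len(X)):
    z = X[t - 1]
    S00 += 1.0
    S01 += z
    S11 += z * z
    b0 += X[t]
    b1 += X[t] * z
det = S00 * S11 - S01 * S01
a0_hat = (S11 * b0 - S01 * b1) / det
a1_hat = (S00 * b1 - S01 * b0) / det
print(round(a0_hat, 2), round(a1_hat, 2))
```

With a few thousand observations the estimates land close to the hypothetical truth (1.0, 0.4); the empirical-likelihood weights of (2) would then reweight exactly this least squares criterion.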
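The data-adaptive weights in (2) reduce, for a scalar moment condition, to a one-dimensional root-finding problem in the Lagrange multiplier. The sketch below solves the multiplier equation by bisection for a toy moment condition (imposing a known mean, an assumption made purely for illustration) and verifies that the resulting empirical likelihood weights sum to one and satisfy the moment constraint.

```python
import random

random.seed(7)

# toy scalar moment condition: g_t = y_t - mu0, imposing the known mean mu0
mu0 = 2.0
y = [random.gauss(2.0, 1.0) for _ in range(400)]
g = [v - mu0 for v in y]
n = len(g)

def score(lam):
    # (1/n) sum g_t / (1 + lam*g_t): the multiplier equation to be zeroed
    return sum(gt / (1.0 + lam * gt) for gt in g) / n

# bisect over the interval on which every 1 + lam*g_t stays positive;
# score is strictly decreasing there, so the root is bracketed
lo = -1.0 / max(g) + 1e-9
hi = -1.0 / min(g) - 1e-9
for _ in range(200):
    mid = (lo + hi) / 2.0
    if score(mid) > 0:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2.0

# empirical likelihood weights, as in equation (2)
w = [1.0 / (n * (1.0 + lam * gt)) for gt in g]
print(abs(sum(w) - 1.0), abs(sum(wt * gt for wt, gt in zip(w, g))))
```

At the solution the weights sum to one automatically (summing the identity $1/(1+\lambda g) = 1 - \lambda g/(1+\lambda g)$ and using the zeroed multiplier equation), which is why only the multiplier equation needs solving.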
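For the threshold model (7), the least squares estimating equation decouples because $X_{t-1}^+ X_{t-1}^- = 0$ for every $t$. The following simulation sketch uses hypothetical parameter values satisfying Assumption 3.1 ($\phi_{\max} + \sqrt{\alpha_{\max}} \approx 0.85 < 1$) and assumes Gaussian innovations for convenience.

```python
import math
import random

random.seed(3)

# hypothetical parameters, for illustration only
phi1, phi2 = 0.3, -0.2
al0, al1, al2 = 0.5, 0.2, 0.3
n = 20000

X, eps = [0.0], 0.0
for _ in range(n):
    # h_t = alpha0 + alpha1*(eps_{t-1}^+)^2 + alpha2*(eps_{t-1}^-)^2
    h = al0 + al1 * max(eps, 0.0) ** 2 + al2 * min(eps, 0.0) ** 2
    eps = math.sqrt(h) * random.gauss(0.0, 1.0)   # e_t ~ N(0,1), an assumption
    X.append(phi1 * max(X[-1], 0.0) + phi2 * min(X[-1], 0.0) + eps)

# least squares estimating equation sum_t (X_t - Xvec_t' phi) Xvec_t = 0
# with Xvec_t = (X_{t-1}^+, X_{t-1}^-)'; since X^+ * X^- = 0 the two
# components decouple into two scalar regressions
Spp = Smm = bp = bm = 0.0
for t in range(1, len(X)):
    xp, xm = max(X[t - 1], 0.0), min(X[t - 1], 0.0)
    Spp += xp * xp
    Smm += xm * xm
    bp += X[t] * xp
    bm += X[t] * xm
phi1_hat, phi2_hat = bp / Spp, bm / Smm
print(round(phi1_hat, 2), round(phi2_hat, 2))
```

The profile empirical likelihood ratio of the thesis is built on exactly these estimating functions $H_t(\phi)$, evaluated at candidate values of $\phi$ rather than at the least squares solution.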
Keywords/Search Tags: conditional heteroscedastic model, auxiliary information, empirical likelihood, confidence region, weighted least squares estimation