
Estimations And Tests Of Autoregressive Models

Posted on: 2012-04-01    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Z W Zhao    Full Text: PDF
GTID: 1100330335951983    Subject: Probability theory and mathematical statistics
Abstract/Summary:
As an important branch of probability theory and mathematical statistics, time series analysis is widely applied in economics, the social sciences, the natural sciences and other fields. According to whether a model is linear, the models currently studied can be divided into two categories: linear time series models and nonlinear time series models. The constant-coefficient autoregressive model is the most common linear time series model. Many economic relationships are dynamic in nature; when we build a time series model to describe such a dynamic relationship, we find that the dynamic variable is time-lagged and is also related to some fixed variables. For these reasons, Koopmans et al. (1950) and Anderson and Rubin (1950) considered the following univariate, pth-order autoregressive scheme with q explanatory variables:

Y_t = β^τ Y_(t-1) + α^τ Z_t + ε_t,  t = 1, 2, …, n,

where β = (β_1, β_2, …, β_p)^τ, α = (α_1, α_2, …, α_q)^τ, Y_(t-1) = (Y_{t-1}, …, Y_{t-p})^τ and Z_t = (Z_{1t}, …, Z_{qt})^τ; τ denotes the transpose of a vector or matrix, α and β are unknown parameter vectors, the Z_t are explanatory variables, and {ε_t} is the random disturbance sequence. Although the linear time series model has a simple form and its theory is quite complete, Kendall (1953) pointed out that, when modeling economic time series, nonlinear models often fit the data more satisfactorily than linear models. Therefore, over the past thirty or forty years, nonlinear time series models have attracted the attention of many statisticians and a large body of results has been obtained. At the same time, random coefficient models have received increasing attention from economists; among them, the random coefficient autoregressive model is currently the most widely studied.
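As a quick illustration of the scheme above (not taken from the thesis), the following sketch simulates an AR(2) process with one explanatory variable and recovers the coefficients by least squares; all parameter values are hypothetical and chosen only so that the stationarity condition holds.

```python
import numpy as np

# Hypothetical demo: Y_t = b1*Y_{t-1} + b2*Y_{t-2} + a1*Z_t + eps_t,
# i.e. the univariate pth-order scheme with p = 2 and q = 1.
rng = np.random.default_rng(0)
n = 5000
beta = np.array([0.5, -0.3])         # autoregressive coefficients (assumed)
alpha = np.array([1.0])              # explanatory-variable coefficient (assumed)
Z = rng.normal(size=(n, 1))          # explanatory variable Z_t
eps = rng.normal(size=n)             # random disturbance eps_t
Y = np.zeros(n)
for t in range(2, n):
    # beta @ (Y_{t-1}, Y_{t-2}) + alpha @ Z_t + eps_t
    Y[t] = beta @ Y[t-2:t][::-1] + alpha @ Z[t] + eps[t]

# Stack the regressors (Y_{t-1}, Y_{t-2}, Z_t) and solve the normal equations.
X = np.column_stack([Y[1:-1], Y[:-2], Z[2:, 0]])
theta_hat, *_ = np.linalg.lstsq(X, Y[2:], rcond=None)
print(theta_hat)  # should be close to [0.5, -0.3, 1.0]
```

The roots of x^2 - 0.5x + 0.3 = 0 have modulus about 0.55 < 1, so the simulated process satisfies a condition of the type imposed in Assumption 1.1 below.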
The univariate, pth-order random coefficient autoregressive model has the following form:

Y_t = Φ_t^τ Y_(t-1) + ε_t,  t = 1, 2, …, n,

where Y_(t-1) = (Y_{t-1}, …, Y_{t-p})^τ, Φ_t = (Φ_{t1}, …, Φ_{tp})^τ is a p×1 vector of random coefficients with E(Φ_t) = (Φ_1, …, Φ_p)^τ, and ε_t is the random error. The random coefficient autoregressive model has wide application in many fields. One typical example is the use of a second-order random coefficient autoregressive model to fit the Canadian lynx data; the fitting results indicate that the second-order random coefficient autoregressive model gives a smaller mean square error than a higher-order linear autoregressive model. For more on the importance of random coefficient autoregressive models, we refer to Nicholls and Quinn (1982) or Tong (1990). In this thesis, we mainly discuss parameter estimation and hypothesis testing for the autoregressive model with explanatory variables and for the random coefficient autoregressive model. In what follows, we introduce the main results.

Firstly, we use the empirical likelihood method to discuss statistical inference for the parameters of the autoregressive model with explanatory variables. Specifically, consider the following pth-order, univariate autoregressive scheme with q explanatory variables:

Y_t = β^τ Y_(t-1) + α^τ Z_t + ε_t,  t = 1, 2, …, n,  (1)

where β = (β_1, β_2, …, β_p)^τ, α = (α_1, α_2, …, α_q)^τ, Y_(t-1) = (Y_{t-1}, …, Y_{t-p})^τ, Z_t = (Z_{1t}, …, Z_{qt})^τ; α and β are the unknown parameter vectors to be estimated, the Z_t are explanatory variables, and F_{t-1} is the σ-field of events generated by (Y_(0), Y_1, …, Y_{t-1}). Suppose that the recorded data {Y_(0), {Z_1, Y_1}, …, {Z_n, Y_n}} are generated by model (1). Let J_t = (Y_(t-1)^τ, Z_t^τ)^τ, Σ_n = E(Σ_{t=1}^n ε_t² J_t J_t^τ), and h_t(β, α) = J_t(Y_t − β^τ Y_(t-1) − α^τ Z_t). Using the estimating equation of the conditional least squares estimator, we can establish the empirical log-likelihood ratio statistic

l(β, α) = 2 Σ_{t=1}^n log(1 + λ^τ h_t(β, α)),

where λ satisfies

(1/n) Σ_{t=1}^n h_t(β, α) / (1 + λ^τ h_t(β, α)) = 0.

Assume that (β_0, α_0) is the true value of (β, α). To obtain the limiting distribution of l(β_0, α_0), the following assumptions are made.

Assumption 1.1. There exists an η < 1 such that all solutions of x^p − β_1 x^{p−1} − … − β_p = 0 are less than η in modulus.
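The empirical-likelihood construction just described can be sketched in its simplest one-dimensional form. The snippet below (an illustration, not the thesis's implementation) takes scalar scores h_t, solves the dual equation Σ h_t/(1 + λ h_t) = 0 for λ by damped Newton iteration, and evaluates l = 2 Σ log(1 + λ h_t); the data and the trial parameter are made up.

```python
import numpy as np

# One-dimensional empirical likelihood sketch: h plays the role of the
# scores h_t(beta, alpha) evaluated at a trial parameter value.
rng = np.random.default_rng(3)
h = rng.normal(loc=0.1, size=200)          # assumed scores (mean != 0 here)

lam = 0.0
for _ in range(50):                        # damped Newton iteration for lam
    denom = 1.0 + lam * h
    g = np.sum(h / denom)                  # dual estimating equation in lam
    gp = -np.sum((h / denom) ** 2)         # its derivative (always negative)
    step = g / gp
    while np.any(1.0 + (lam - step) * h <= 1e-10):
        step /= 2.0                        # keep all EL weights positive
    lam -= step

l_stat = 2.0 * np.sum(np.log(1.0 + lam * h))
print(l_stat)  # under the true parameter this is approximately chi-squared(1)
```

Because λ maximizes the concave function Σ log(1 + λ h_t), the statistic l_stat is always non-negative, and under the true parameter value it converges to a chi-squared limit, as in the theorems below.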
Assumption 1.2. For any (p+q)-dimensional unit vector b and any ε > 0,

(1/s_n²) Σ_{t=1}^n E(X_t² I(|X_t| > ε s_n)) → 0,

where X_t = ε_t (Y_(t-1)^τ, Z_t^τ) b and s_n² = E(Σ_{t=1}^n X_t²).

Assumption 1.3. Σ_n/n → Σ as n → ∞, and Σ > 0.

Assumption 1.4. (i) sup_t E(‖Z_t‖^4) < ∞; (ii) sup_t E(|ε_t|^{4+ε}) < ∞ for some ε > 0.

Assumption 1.5. E‖Y_(0)‖^4 < ∞.

The limit distribution of l(β_0, α_0) is given by the following theorem.

Theorem 1. Assume that Assumptions 1.1–1.5 hold and that {Z_t, t ≥ 1} is a stochastic process independent of {ε_t, t ≥ 1}. Then l(β_0, α_0) converges in distribution to χ²_{p+q}, the chi-squared distribution with p + q degrees of freedom.

Secondly, we use the empirical likelihood method to discuss statistical inference for the parameters of the pth-order generalized random coefficient autoregressive model. Specifically, consider the model

Y_t = Φ_t^τ Y_(t-1) + ε_t,

where Φ_t = (Φ_{t1}, …, Φ_{tp})^τ is a p×1 vector of random coefficients and Y_(t-1) = (Y_{t-1}, …, Y_{t-p})^τ. The problem of interest is to estimate the unknown parameter Φ = E(Φ_t) and the k×1 vector β = (σ_ε², σ_{Φε}^τ, ν_{11}, ν_{22}, …, ν_{pp}, ν_{12}, ν_{23}, …, ν_{p−1,p}, …, ν_{1p})^τ, where k = (p+1)(p+2)/2 and ν_{ij} denotes the (i, j)th element of V_Φ. Before stating the main results, the following assumption is made.

Assumption 2.1. All the eigenvalues of the matrix E(C_t ⊗ C_t) + (B ⊗ B) are less than unity in modulus.

Let G_t(Φ) = Y_t Y_(t-1) − Y_(t-1) Y_(t-1)^τ Φ. Using the estimating equation of the least squares estimator, we can establish the empirical log-likelihood ratio statistic

l(Φ) = 2 Σ_{t=1}^n log(1 + λ^τ G_t(Φ)),

where λ satisfies

(1/n) Σ_{t=1}^n G_t(Φ) / (1 + λ^τ G_t(Φ)) = 0.

Assume that Φ_0 is the true value of Φ. Below, we show that l(Φ_0) converges to the chi-squared distribution with p degrees of freedom.

Theorem 2. Assume that Assumption 2.1 holds and E Y_1^4 < ∞. Then l(Φ_0) converges in distribution to χ²_p, the chi-squared distribution with p degrees of freedom.

Let R_t(Φ) = Y_t − E(Y_t | Y_(t-1)) and α_t(β) = E(R_t²(Φ) | Y_(t-1)). Further, denote the conditional least squares estimate of Φ by Φ̂. For t = 1, 2, …, n, define the k×1 vector

X_t = (1, 2Y_{t-1}, …, 2Y_{t-p}, Y_{t-1}², …, Y_{t-p}², 2Y_{t-1}Y_{t-2}, …, 2Y_{t-p+1}Y_{t-p}, …, 2Y_{t-1}Y_{t-p})^τ
and H_t(β) = X_t X_t^τ β − X_t R_t²(Φ̂). Then we can establish the empirical log-likelihood ratio statistic

l(β) = 2 Σ_{t=1}^n log(1 + η^τ H_t(β)),

where η ∈ R^k satisfies

(1/n) Σ_{t=1}^n H_t(β) / (1 + η^τ H_t(β)) = 0.

Assume that β_0 is the true value of β. The following theorem indicates that the limit distribution of l(β_0) is a chi-squared distribution.

Theorem 3. Assume that Assumption 2.1 holds and E Y_1^8 < ∞. Then l(β_0) converges in distribution to χ²_k, the chi-squared distribution with k degrees of freedom.

In what follows, we discuss the test of stationarity-ergodicity and the parameter change test for the first-order generalized random coefficient autoregressive model. For the test of stationarity-ergodicity, we consider the first-order model

Y_t = Φ_t Y_{t-1} + ε_t,

where (Φ_t, ε_t)^τ is a random vector with mean (Φ, 0)^τ, Var(Φ_t) = σ_Φ², Var(ε_t) = σ_ε² and Cov(Φ_t, ε_t) = σ_{Φε}. Further, we assume that (Φ_t, ε_t)^τ is independent of F_{t-1} = σ(Y_{t-1}, Y_{t-2}, …). Before stating the main results, we make the following assumption.

Assumption 3.1. Φ² + Var(Φ_t) < 1.

Let θ = Φ² + Var(Φ_t). In order to obtain the test statistic, we consider the estimation of θ. Let γ = (θ, σ_ε², σ_{Φε})^τ. A conditional least-squares estimator γ̂ of γ can be obtained by minimizing

Q(γ) = Σ_{t=1}^n (Y_t² − N_t^τ γ)²,

where N_t = (Y_{t-1}², 1, 2Y_{t-1})^τ. Solving dQ/dγ = 0 for γ, we obtain

γ̂ = (Σ_{t=1}^n N_t N_t^τ)^{-1} Σ_{t=1}^n Y_t² N_t.

The following theorem gives the limit distribution of γ̂.

Theorem 4. Assume that Assumption 3.1 holds. If E Y_1^8 < ∞, then √n(γ̂ − γ) converges in distribution to N(0, L^{-1} M L^{-1}), where L = E(N_1 N_1^τ) and M = E(N_1 N_1^τ (Y_1² − N_1^τ γ)²).

From the estimator γ̂ of γ, a natural estimator of θ is θ̂ = T^τ γ̂, where T = (1, 0, 0)^τ.
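The conditional least-squares estimator γ̂ above has a closed form, which the following sketch illustrates on simulated data (an illustration only; the parameter values are assumed, with Φ_t and ε_t drawn independently so that σ_{Φε} = 0 and θ = 0.4² + 0.3² = 0.25 < 1).

```python
import numpy as np

# RCA(1) model Y_t = Phi_t * Y_{t-1} + eps_t with Phi_t ~ N(0.4, 0.3^2),
# eps_t ~ N(0, 1).  Estimate gamma = (theta, sigma_eps^2, sigma_{Phi eps})
# by regressing Y_t^2 on N_t = (Y_{t-1}^2, 1, 2*Y_{t-1})^T.
rng = np.random.default_rng(1)
n, phi_mean, phi_sd = 20000, 0.4, 0.3     # assumed values; theta = 0.25
Y = np.zeros(n)
for t in range(1, n):
    Y[t] = (phi_mean + phi_sd * rng.normal()) * Y[t-1] + rng.normal()

N = np.column_stack([Y[:-1]**2, np.ones(n - 1), 2 * Y[:-1]])   # rows are N_t
gamma_hat = np.linalg.solve(N.T @ N, N.T @ Y[1:]**2)           # CLS estimator
theta_hat = gamma_hat[0]                  # T^tau gamma_hat with T = (1,0,0)^tau
print(theta_hat)  # should be near theta = 0.25
```

The regression works because E(Y_t² | F_{t-1}) = θ Y_{t-1}² + σ_ε² + 2σ_{Φε} Y_{t-1} = N_t^τ γ, so Y_t² − N_t^τ γ is a martingale difference, which is what drives the normal limit in Theorem 4.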
Corollary 1. Assume that Assumption 3.1 holds. If E Y_1^8 < ∞, then √n(θ̂ − θ)/√D_n converges in distribution to N(0, 1), where

D_n = T^τ ((1/n) Σ N_t N_t^τ)^{-1} ((1/n) Σ N_t N_t^τ (Y_t² − N_t^τ γ̂)²) ((1/n) Σ N_t N_t^τ)^{-1} T.

Consider the test of the hypothesis H_0: θ ≤ 1 − h against H_1: θ > 1 + h, where 0 < h < 1.

For the parameter change test, we test H_0: Φ_t does not change against H_1: Φ_t changes at some t > T. Before stating the main results, we make the following assumptions.

Assumption 4.1. The distributions of Φ_t and ε_t are absolutely continuous with respect to Lebesgue measure on R¹ and their densities are strictly positive on some neighborhood of 0.

Assumption 4.2. Φ_0² + V_{Φ,0} < 1.

Assumption 4.3. E(Φ_t^6) < ∞ and E(ε_t^6) < ∞.

Let θ_t = Φ_t² + V_{Φ,t} and θ_0 = Φ_0² + V_{Φ,0}. If Assumptions 4.2 and 4.3 hold under H_0, then (Σ_{t=1}^k Y_{t-1}²)^{-1} Σ_{t=1}^k Y_t Y_{t-1} (denoted Φ̂_k) is a consistent estimator of Φ_0. Φ̂_k is asymptotically normal and its asymptotic variance is J = σ_{ε,0}^{-2}(1 − θ_0)²(σ_{ε,0}² E Y_0² + 2σ_{Φε,0} E Y_0³ + (θ_0 − Φ_0²) E Y_0^4) (Hwang and Basawa, 1998). In what follows, we consider the estimation of J. By the ergodic theorem, for any integer l ≥ 1, (1/n) Σ_{t=1}^n Y_{t-1}^l is a consistent estimator of E Y_0^l. Let N_t = (Y_{t-1}², 1, 2Y_{t-1})^τ, Υ_t = (θ_t, σ_{ε,t}², σ_{Φε,t})^τ and Υ_0 = (θ_0, σ_{ε,0}², σ_{Φε,0})^τ. In order to obtain a consistent estimator Ĵ_k of J, we only need a consistent estimator of Υ_0; by Theorem 4, (Σ N_t N_t^τ)^{-1} Σ Y_t² N_t is such an estimator. Having obtained consistent estimators of Φ_0 and J, we can establish the monitoring test statistic τ_1(T), for which we have the following results.

Theorem 5. Assume that Assumptions 4.1–4.3 hold. (i) If g(s) = c g_1(s), s ∈ (1, ∞), where c is a positive constant and g_1 is a given continuous real-valued function with inf_{s>1} g_1(s) > 0, then the false-alarm probability P_{H_0}{τ_1(T) < ∞} can be made arbitrarily small by taking c large; in particular, this holds if g(s) = c and ‖·‖ = ‖·‖_2. (ii) If g(s) = √s, s ∈ (1, ∞) and ‖·‖ = ‖·‖_∞, then lim_{T→∞} P_{H_0}{τ_1(T) < ∞} = 2 − 2Φ(e) + 2e φ(e), where φ
and Φ denote the density function and distribution function of N(0, 1), respectively.

Finally, we give the limit theory for the first-order random coefficient autoregressive process under a martingale difference error sequence. Specifically, we consider the time series

Y_{nt} = (ρ_n + Φ_n) Y_{n,t-1} + ε_t,  t = 1, …, n,

initialized at some Y_0, where {ε_t} is a stationary ergodic martingale difference sequence with respect to the natural filtration H_t = σ(ε_1, …, ε_t), ρ_n is a sequence of real numbers, {Φ_n} is a sequence of random variables, and σ(Y_0), σ(Φ_n), σ(ε_1, …, ε_t) are mutually independent. For convenience of exposition, we employ the abbreviated notation {Y_t} and {Y_0} for {Y_{nt}} and {Y_{n0}}, respectively. Let F_{nt} = σ(Y_0, Φ_n, ε_i, 1 ≤ i ≤ t) for 1 ≤ t ≤ n and n ≥ 1. Note that E(Y_t | F_{n,t-1}) = (ρ_n + Φ_n) Y_{t-1}. Based on Φ_n and the sample Y_1, …, Y_n, minimizing Q = Σ_{t=1}^n (Y_t − E(Y_t | F_{n,t-1}))² with respect to ρ_n, we obtain the conditional least-squares estimator of ρ_n:

ρ̂_n = (Σ_{t=1}^n Y_{t-1}²)^{-1} Σ_{t=1}^n Y_t Y_{t-1} − Φ_n.

In order to derive the limit theory, the following assumptions are made.

Assumption 5.1. |ρ_n + Φ_n| < 1 a.s. and E(Φ_n²) → 0 as n → ∞.

Assumption 5.2. E Y_0² = o(n).

Assumption 5.3. There exists α > 0 such that sup_t E(|ε_t|^{2+α} | H_{t-1}) < ∞ a.s.

Assumption 5.4. There exists 0 < σ² < ∞ such that E(ε_t² | H_{t-1}) < σ² a.s.

Theorems 6 and 7 then establish, under conditions 5.1–5.4, the asymptotic distribution of ρ̂_n and of the corresponding self-normalized statistic.
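The conditional least-squares estimator ρ̂_n in this last setting can be sketched as follows (an illustration with assumed values, not the thesis's experiment): simulate the model with a single random coefficient draw Φ_n, then invert the estimating equation given Φ_n.

```python
import numpy as np

# Y_t = (rho_n + Phi_n) * Y_{t-1} + eps_t, with eps_t i.i.d. N(0,1) as a
# simple special case of a martingale difference sequence.  Values assumed:
# rho_n = 0.95, Phi_n ~ N(0, 0.01^2), so |rho_n + Phi_n| < 1 a.s. in practice.
rng = np.random.default_rng(2)
n, rho_n = 10000, 0.95
phi_n = rng.normal(scale=0.01)          # one draw of the random coefficient
Y = np.zeros(n + 1)
for t in range(1, n + 1):
    Y[t] = (rho_n + phi_n) * Y[t-1] + rng.normal()

# rho_hat = (sum Y_{t-1}^2)^{-1} sum Y_t Y_{t-1}  -  Phi_n
rho_hat = (Y[1:] @ Y[:-1]) / (Y[:-1] @ Y[:-1]) - phi_n
print(rho_hat)  # should be close to rho_n = 0.95
```

Subtracting Φ_n is what makes this a conditional least-squares estimator of ρ_n given the random coefficient, matching the minimization of Q described above.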
Keywords/Search Tags: autoregressive model, empirical likelihood, confidence region, least squares estimation, hypothesis test