
Estimation Based On Distance Of Density For ARCH Models

Posted on: 2008-11-04
Degree: Master
Type: Thesis
Country: China
Candidate: M Peng
Full Text: PDF
GTID: 2120360215452642
Subject: Probability theory and mathematical statistics
Abstract/Summary:
ARCH models have become among the most popular and extensively studied financial econometric models of the past twenty years. Chandra and Taniguchi (2006) proposed minimum α-divergence estimation for ARCH models. In this paper, we propose estimation based on distance of density for ARCH models, using the techniques of minimum α-divergence estimation, and prove that the proposed estimator is consistent.

In the first chapter, we introduce several basic concepts.

Firstly, a class of ARCH(p) processes is characterized by the equations

    X_t = σ_t ε_t,    σ_t^2 = b_0 + Σ_{i=1}^p b_i X_{t-i}^2,

where {ε_t} is a sequence of i.i.d.(0,1) random variables with probability density g(x), and ε_t is independent of X_s, s < t. It is assumed that b_0 > 0, b_i ≥ 0 for 1 ≤ i ≤ p, and b_1 + ... + b_p < 1.

Secondly, we introduce α-divergence functionals. Let f_θ, θ ∈ H, be a postulated model, where H is a compact subset of R^k, and let g be the true probability density. The α-divergence from f_θ to g is given by

    D_α(f_θ, g) = ∫ K_α(g(x)/f_θ(x)) f_θ(x) dx,    (1.2.1)

where K_α(z) = {4/(1-α^2)}{1 - z^{(1+α)/2}}, -1 < α < 1. The limit of (1.2.1) as α → -1 reduces to the well-known Kullback-Leibler information, and for α = 0 the 0-divergence is (up to a constant) the squared Hellinger distance.

Lastly, we give an introduction to the kernel density estimator.

In the second chapter, we introduce three different distances of density, propose estimation based on these distances of density for ARCH models using the techniques of minimum α-divergence estimation, and prove that the proposed estimator is consistent. Based on the Hellinger distance, we define three distances of density D_1, D_2 and D_3, where Φ(x) is a bounded probability density that is even and monotone decreasing on the positive half-line.
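The two limiting cases of the α-divergence can be checked numerically. The following Python sketch is illustrative only: the two unit-variance normal densities standing in for f_θ and g, the integration grid, and the value α = -0.999 as a stand-in for the limit α → -1 are assumptions, not choices made in the thesis.

```python
import numpy as np

def trap(y, x):
    """Trapezoidal rule, written out so the sketch is NumPy-version independent."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def alpha_divergence(f, g, x, alpha):
    """D_alpha(f, g) = ∫ K_alpha(g/f) f dx, K_alpha(z) = {4/(1-a^2)}{1 - z^((1+a)/2)}."""
    z = g / f
    K = (4.0 / (1.0 - alpha ** 2)) * (1.0 - z ** ((1.0 + alpha) / 2.0))
    return trap(K * f, x)

# illustrative stand-ins for the postulated and true densities
x = np.linspace(-10.0, 10.0, 20001)
f = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)          # N(0, 1) as f_theta
g = np.exp(-(x - 0.5) ** 2 / 2) / np.sqrt(2 * np.pi)  # N(0.5, 1) as g

kl = trap(f * np.log(f / g), x)                  # Kullback-Leibler information
hell2 = trap((np.sqrt(f) - np.sqrt(g)) ** 2, x)  # squared Hellinger distance

print(alpha_divergence(f, g, x, -0.999), kl)        # alpha -> -1 recovers KL
print(alpha_divergence(f, g, x, 0.0), 2.0 * hell2)  # alpha = 0 gives 2‖√f - √g‖²
```

For these two normals the Kullback-Leibler information is (0.5)²/2 = 0.125, which both the direct integral and the α → -1 divergence reproduce, while the 0-divergence matches twice the squared Hellinger distance term by term.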
Taking D_1 as an example, we construct the estimation based on distance of density for ARCH models and prove that this new estimator is consistent.

First, we present ĝ_n(x), the nonparametric kernel density estimator of the innovation density g(x), following the techniques of minimum α-divergence estimation for ARCH models.

Then we define a functional T_1 based on D_1(f_θ, g). Denote by 𝒢 the set of all bounded probability densities with respect to Lebesgue measure on R, and let f_t, t ∈ H, be a postulated model, where H is a compact subset of R^k. The functional T_1 on 𝒢 is determined by the requirement that, for every g ∈ 𝒢,

    D_1(f_{T_1(g)}, g) = min_{t ∈ H} D_1(f_t, g).    (2.2.4)

T_1(g) may be multiple-valued; it then denotes any one of the possible values, chosen arbitrarily.

Under Assumption 2.2.1, we establish the existence, continuity and uniqueness of T_1(g) in Theorem 2.2.1.

Assumption 2.2.1
(i) f_θ(x) and g(x) are uniformly bounded, continuous, and integrable in x.
(ii) f_θ^{1/2} and g^{1/2} are integrable on R.
(iii) g has mean 0 and variance 1.

Theorem 2.2.1. Suppose that θ_1 ≠ θ_2 implies f_{θ_1} ≠ f_{θ_2} on a set of positive Lebesgue measure, and that for almost every x, f_θ(x) is continuous in θ. Then
(a) for every g ∈ 𝒢 there exists a value T_1(g) ∈ H satisfying (2.2.4);
(b) if T_1(g) is unique, and if g_n ∈ 𝒢 is uniformly bounded with g_n → g, then T_1(g_n) → T_1(g) as n → ∞;
(c) T_1(f_θ) = θ uniquely for every θ ∈ H.

Thus, we take T_1(ĝ_n) as the estimator of T_1(g), and under Assumption 2.2.2 prove in Theorem 2.2.2 that this new estimator is consistent.

Assumption 2.2.2
(i) W is even, and twice continuously differentiable with compact support S_W.
(ii) W'' is bounded.
(iii) c_n = O(n^{-λ}) with 1/4 < λ < 1/3.

Theorem 2.2.2. Suppose that Assumptions 2.2.1 and 2.2.2 hold. Then
(a) ‖ĝ_n - g‖ → 0 in probability as n → ∞;
(b) if T_1(g) is unique, then T_1(ĝ_n) → T_1(g) in probability as n → ∞.

This theorem is proved by Chandra and Taniguchi.

In the last chapter, we compute the estimate θ̂_n = T_1(ĝ_n) of θ from simulated data and examine graphically how well f_{θ̂_n}(x) approximates g(x).
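The steps above — simulate an ARCH path, recover the innovations, kernel-estimate their density, and minimize the Hellinger-type distance over the postulated family — can be sketched end to end. Every concrete choice below is an illustrative assumption, not taken from the thesis: the ARCH(1) coefficients are treated as known when forming residuals, the postulated family f_θ is a standardized Student-t with θ the degrees of freedom, W is the Epanechnikov kernel (compactly supported and even, though not twice continuously differentiable as Assumption 2.2.2 strictly requires), and T_1 is computed by a grid search rather than a proper optimizer.

```python
import numpy as np
from math import gamma, sqrt, pi

rng = np.random.default_rng(0)

def std_t_pdf(x, nu):
    """Student-t density rescaled to mean 0, variance 1 (needs nu > 2)."""
    c = gamma((nu + 1) / 2) / (gamma(nu / 2) * sqrt(pi * (nu - 2)))
    return c * (1 + x ** 2 / (nu - 2)) ** (-(nu + 1) / 2)

def epan_kde(data, x, cn):
    """Kernel density estimate ĝ_n with the Epanechnikov kernel and bandwidth c_n."""
    u = (x[:, None] - data[None, :]) / cn
    W = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
    return W.mean(axis=1) / cn

def hellinger(f, g, x):
    """D_1-style distance ‖f^{1/2} - g^{1/2}‖_2 by trapezoidal integration."""
    d2 = (np.sqrt(f) - np.sqrt(g)) ** 2
    return sqrt(float(np.sum((d2[1:] + d2[:-1]) * np.diff(x)) / 2.0))

# simulate ARCH(1): X_t = sigma_t * eps_t, sigma_t^2 = b0 + b1 * X_{t-1}^2
n, b0, b1, nu_true = 2000, 0.5, 0.3, 6.0
eps = rng.standard_t(nu_true, size=n) * sqrt((nu_true - 2) / nu_true)
X = np.zeros(n)
for t in range(1, n):
    X[t] = sqrt(b0 + b1 * X[t - 1] ** 2) * eps[t]

# residuals using the true coefficients -- an assumption isolating the density step
sigma = np.sqrt(b0 + b1 * X[:-1] ** 2)
res = X[1:] / sigma

x = np.linspace(-8.0, 8.0, 4001)
cn = len(res) ** (-0.3)            # c_n = O(n^{-lambda}), lambda = 0.3 in (1/4, 1/3)
g_hat = epan_kde(res, x, cn)

# T_1(ĝ_n): minimise the distance over the postulated family f_theta
grid = np.arange(3.0, 30.5, 0.5)   # candidate degrees of freedom
dists = [hellinger(std_t_pdf(x, nu), g_hat, x) for nu in grid]
nu_hat = float(grid[int(np.argmin(dists))])
print("T1(g_hat) =", nu_hat)
```

As a sanity check on Theorem 2.2.1(c), plugging f_θ itself in place of ĝ_n makes the distance vanish exactly at θ, so the grid search returns the true parameter; with the estimated ĝ_n the minimizer is only close to ν_true, consistent with Theorem 2.2.2.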
Keywords/Search Tags: Estimation