
The Tail Essential Dependence And Others Based On Kullback-Leibler Divergence

Posted on: 2019-11-13
Degree: Master
Type: Thesis
Country: China
Candidate: L Y Zhang
GTID: 2370330566984563
Subject: Probability theory and mathematical statistics

Abstract/Summary:
The study of dependence is an important problem in probability and statistics, and a large body of results has accumulated since the early twentieth century; many scholars have contributed to the construction and testing of dependence measures for random variables and random vectors. The Kullback-Leibler divergence, a basic quantity in information theory and probability, expresses the difference between two density functions and is widely applied. In the context of dependence, let X = (X1, X2, ..., Xn)^T be partitioned into groups X^(1), ..., X^(m) (m ≤ n). Take the two functions in the Kullback-Leibler divergence to be the joint density of X and the product of the marginal densities of X^(1), ..., X^(m), that is, the density X would have if the groups were mutually independent (the independence density). The Kullback-Leibler divergence then measures the discrepancy between the joint density and the independence density and thus carries the information about the dependence among X^(1), ..., X^(m) (a schematic form is given after the abstract). Moreover, the Kullback-Leibler divergence can be written as a functional of the copula density. It is well known that copula-based dependence measures are invariant under monotone transformations of the components and can capture more general dependence structures, such as nonlinear, asymmetric and tail dependence. Hence the Kullback-Leibler divergence is deeply rooted in scientific practice and deserves further application in the study of essential dependence.

Based on the Kullback-Leibler divergence, this thesis first studies the tail essential dependence among X^(1), ..., X^(m), conditional on X^(i) > U^(i), i = 1, ..., m. For the normal and gamma distributions, and under different groupings, the relationship between the tail essential dependence and the corresponding parameters is analyzed by Monte Carlo simulation (a minimal simulation sketch follows the abstract). Secondly, starting from two groups of random vectors of the same dimension, two extended groups of vectors are defined, and the amount of essential dependence within each extended group minus the sum of the amounts of essential dependence of the two original groups is calculated. Finally, three further cases are worked out: the essential dependence, under different groupings, among random vectors whose joint density is a convex combination of two density functions; the essential dependence between a random variable and a random vector with a specified linear correlation; and the essential dependence between a continuous random variable and a discrete random variable satisfying a certain collective risk model.
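As a schematic restatement of the measure described in the first paragraph (the notation below is chosen for this summary and may differ from the thesis), the essential dependence among the groups is the Kullback-Leibler divergence between the joint density f of X and the product of the group marginal densities:

$$
D\Bigl(f \,\Big\|\, \prod_{i=1}^{m} f_{X^{(i)}}\Bigr)
\;=\; \int_{\mathbb{R}^{n}} f(x)\,
\log\frac{f(x)}{\prod_{i=1}^{m} f_{X^{(i)}}\bigl(x^{(i)}\bigr)}\,\mathrm{d}x .
$$

This quantity is nonnegative and vanishes exactly when X^(1), ..., X^(m) are mutually independent. In the finest grouping m = n it reduces to $\int_{[0,1]^{n}} c(u)\log c(u)\,\mathrm{d}u$, with c the copula density of X, which is one way to see the invariance under monotone transformations of the components mentioned above.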
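The following minimal sketch illustrates how such a measure can be estimated by Monte Carlo simulation in the bivariate normal case. It is only an illustration of the general approach: the parameter values, the thresholds, and the particular form of the tail-conditioned measure (Kullback-Leibler divergence between the joint density conditioned on the joint tail region and the product of the tail-conditioned marginal densities) are assumptions made here, not the definitions used in the thesis.

```python
# Illustrative Monte Carlo sketch (assumed setup, not the thesis's exact definitions):
# estimate a Kullback-Leibler-based dependence measure for a bivariate normal vector,
# unconditionally and restricted to the upper-tail region {X1 > u1, X2 > u2}.
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.6            # correlation parameter (assumed value)
u1, u2 = 1.0, 1.0    # tail thresholds (assumed values)
n = 1_000_000

joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
x = joint.rvs(size=n, random_state=0)

# Pointwise log density ratio: log f(x1, x2) - log f1(x1) - log f2(x2).
log_ratio = joint.logpdf(x) - norm.logpdf(x[:, 0]) - norm.logpdf(x[:, 1])

# Unconditional measure: E[log ratio], which for the bivariate normal has the
# closed form -0.5 * log(1 - rho^2) and serves as a sanity check.
kl_full = log_ratio.mean()

# Tail-conditioned version: condition the joint density on {X1 > u1, X2 > u2} and
# each marginal on its own tail event; the log ratio then gains a constant term.
tail = (x[:, 0] > u1) & (x[:, 1] > u2)
kl_tail = (log_ratio[tail].mean()
           - np.log(tail.mean())    # log P(X1 > u1, X2 > u2), estimated from the sample
           + np.log(norm.sf(u1))    # log P(X1 > u1)
           + np.log(norm.sf(u2)))   # log P(X2 > u2)

print(f"full-range estimate: {kl_full:.4f}   closed form: {-0.5 * np.log(1 - rho**2):.4f}")
print(f"tail estimate (u1={u1}, u2={u2}): {kl_tail:.4f}")
```

Repeating the estimate over a grid of values of rho, or of the thresholds, gives the kind of parameter-dependence curves that the abstract refers to; the gamma case follows the same pattern with the corresponding joint and marginal densities.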
Keywords/Search Tags: Kullback-Leibler divergence, Essential dependence, Copula, Monte Carlo simulation