As an independent branch of social science, finance has undergone three major revolutions since its birth. The first financial revolution came in 1952, when Markowitz [63] proposed the modern portfolio theory (MPT) based on mean-variance analysis, which marked the birth of modern economic and financial theory. In [63], Markowitz proposed to measure the risk of the expected return of a security by its variance, and to characterize the risk level of a portfolio via the covariances between the securities in the portfolio. The starting point of the second financial revolution was the continuous-time model. In 1969, Merton [65] put forward the optimal portfolio theory under the continuous-time model. Subsequently, in 1973, Black and Scholes [9] and Merton [66] respectively obtained pricing formulas for European stock options using the continuous-time model. The continuous-time model provides a theoretical basis for solving the problem of option pricing and other related problems of financial derivatives. The most recent financial revolution, the third, emerged in 1997 with the coherent risk measure theory proposed by Artzner et al. [2, 3], which is also the main subject of this paper.

In fact, with the development of financial markets and the innovation of financial derivatives, the types of financial risks faced by financial companies (e.g., banks and insurance companies) are increasing, including market risk, credit risk, operational risk, model risk and liquidity risk [64]. Finding an integrated risk measure model that comprehensively accounts for all types of financial risks and their interactions, and that effectively manages and hedges risks by repackaging them and transferring them to markets, has become more and more important. Risk measurement is therefore a core competence of financial companies. In 1996, the Basel Committee on Banking Supervision promulgated an amendment to Basel I of 1988 (the 1996 Amendment) [6], which stipulated that banks and their supervisory authorities use Value at Risk (VaR) as a tool to measure market risk, and formulated the minimum standard for calculating banks' capital charge by VaR. However, although VaR is a widely used integrated risk measure, more and more scholars have pointed out its deficiencies; see Daykin et al. [20], Embrechts et al. [32], Artzner et al. [3], Acerbi and Tasche [1], and Tasche [89]. On the one hand, VaR only controls the probability of a loss, but cannot measure the scale of the loss once a rare event occurs. More importantly, VaR usually does not satisfy the axioms of coherent risk measures proposed by Artzner et al. [3]; in particular, it is not subadditive. This is why diversification is often discouraged by VaR: the measured risk of a portfolio can be greater than the sum of the individual risks of its assets, and Chapter 1 contains a specific example. On the other hand, the calculation of VaR relies on the probability distribution of financial products, and when this distribution is uncertain, VaR cannot measure risk well.
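Before turning to the distributional issue, the failure of subadditivity mentioned above can be made concrete with a standard two-loan calculation. The sketch below is only illustrative (it is not the specific example of Chapter 1); the 4% default probability, the loss of 100 and the 95% confidence level are assumptions chosen for the demonstration.

```python
import numpy as np

def var(losses, probs, alpha=0.95):
    """Value at Risk: smallest x with P(loss <= x) >= alpha."""
    order = np.argsort(losses)
    cdf = np.cumsum(probs[order])
    return losses[order][np.searchsorted(cdf, alpha)]

# Two independent loans: each loses 100 with probability 0.04, else 0.
L = np.array([0.0, 100.0])
p = np.array([0.96, 0.04])

# Loss distribution of the two-loan portfolio (independence assumed).
L_sum = np.array([0.0, 100.0, 200.0])
p_sum = np.array([0.96**2, 2 * 0.96 * 0.04, 0.04**2])

print(var(L, p))          # 0.0   -- each loan alone looks riskless at 95%
print(var(L_sum, p_sum))  # 100.0 -- the diversified portfolio does not
```

Each loan in isolation reports zero risk at the 95% level, because its default probability 4% is below 5%; for the portfolio, the probability of at least one default is 1 - 0.96^2 = 7.84% > 5%, so VaR(X + Y) = 100 > 0 = VaR(X) + VaR(Y), violating subadditivity.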
According to the famous distinction drawn by Knight [56], there are two types of uncertainty in financial markets. The first type, known as "risk", corresponds to situations in which every financial product has a clear probability distribution of gain or loss, on which all market participants can agree. The second type, called "Knightian uncertainty", or "ambiguity" in Ellsberg [31], corresponds to situations where the gain or loss of a financial product does not have an explicit probability distribution agreed upon by all market participants. In other words, the attitudes of market participants towards the possible gain or loss of the same financial product correspond to a set of probability measures P := {P1, P2, …}. In 1961, Ellsberg proposed the famous Ellsberg paradox to clearly explain the difference between risk and uncertainty. Therefore, how to find a coherent alternative to VaR that measures risks under uncertainty has become a financial and mathematical problem of important practical significance.

Delbaen [23] extended coherent risk measures to general probability spaces. Föllmer and Schied [38, 39, 40] and Frittelli and Rosazza Gianin [41] studied a more general case and proposed the concept of convex monetary risk measures. In order to quantitatively analyze and calculate uncertainty in financial markets, Peng [72, 74, 75] stepped outside the classical Kolmogorov axiomatic probability system (Ω, F, P) and, starting from the perspective of expectations, established the sublinear expectation framework (Ω, H, E). Peng [75, 76, 80] gave the definitions of distribution and independence in the sublinear expectation space (Ω, H, E), defined two completely new distributions, the maximal distribution and the G-normal distribution, obtained the law of large numbers and the central limit theorem, and introduced the most important sublinear expectation space, the G-expectation space. In fact, the coherent risk measure introduced by Artzner et al. [3] and Delbaen [23] is essentially a sublinear expectation, while Peng's sublinear expectation has a prominent advantage over the coherent risk measure: it also takes singular probability measures into account, which gives it a wider range of applications.

Merton [67] pointed out that "time and uncertainty are the central elements that influence financial economic behavior". Static risk measures alone cannot accurately depict the impact of the dynamic information of financial markets on financial risks. Peng [70] introduced the concept of g-expectations by studying a class of nonlinear backward stochastic differential equations (BSDEs), and obtained a class of time-consistent dynamic risk measures, the g-risk measures; see Delbaen et al. [25], Peng [73], and Rosazza Gianin [85]. In addition, Artzner et al. [4], Delbaen [24], Riedel [83], and Roorda et al. [84] gave examples and characterizations of coherent risk measures satisfying time consistency. In the framework of decision theory, Epstein and Zin [35], Duffie and Epstein [29], Wang [94] and Epstein and Schneider [34] investigated the time consistency of preferences. We therefore want to systematically study the various conditions that can guarantee the time consistency of dynamic coherent risk measures, analyze the connections and differences between them, and find the simplest dynamic coherent risk measure that guarantees time consistency.

On the other hand, with the rapid development of financial technology (FinTech) and the extensive application of innovative technologies such as big data, cloud computing, artificial intelligence and blockchain, the data generated in financial markets have grown explosively, and any small difference can add up to immeasurable financial risks. As mentioned earlier, these massive financial data contain uncertainty that cannot be ignored, which makes the assumptions of independence and identical distribution under the classical probability framework no longer applicable. Therefore, how to carry out reasonable mathematical modeling of these financial data, give a new independence hypothesis that takes uncertainty into account, and use dynamic risk measures to quantitatively analyze and calculate the limit behavior of financial data, so as to grasp the limit state of financial risks, has become an urgent problem to be solved.
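The set-of-priors viewpoint running through this discussion can be made concrete in a few lines. The following is a minimal numerical sketch, not any construction from this paper: on a finite sample space, a coherent risk measure is represented as ρ(X) = sup over P in P of E_P[-X], i.e., a sublinear expectation of the loss. The two candidate measures P1, P2 and the payoff vectors are invented for the example.

```python
import numpy as np

# Finite sample space with two candidate probability models ("priors"),
# reflecting disagreement among market participants: P = {P1, P2}.
P1 = np.array([0.5, 0.3, 0.2])
P2 = np.array([0.2, 0.3, 0.5])
priors = [P1, P2]

def sublinear_E(X):
    """Sublinear expectation: worst-case mean over the prior set."""
    return max(float(p @ X) for p in priors)

def rho(X):
    """Coherent risk measure via rho(X) = sup_P E_P[-X]."""
    return sublinear_E(-X)

X = np.array([ 4.0, 1.0, -3.0])   # payoff of asset X in the three states
Y = np.array([-2.0, 0.0,  5.0])   # payoff of asset Y

# Coherence in action: subadditivity rho(X+Y) <= rho(X) + rho(Y).
print(rho(X), rho(Y), rho(X + Y))
assert rho(X + Y) <= rho(X) + rho(Y) + 1e-12
```

Subadditivity holds automatically here because a supremum of linear expectations is sublinear; this is exactly the structure that VaR lacks.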
In fact, just as the laws of large numbers and the central limit theorems occupy an important position in classical probability and statistical theory, the study of limit theory under the nonlinear framework has always been a fundamental and important issue for economists and mathematicians. Related works include Marinacci [62], Peng [71], Maccheroni and Marinacci [61], De Cooman and Miranda [21], Peng [78], Peng [80], Li and Shi [58], Chen et al. [17], Chen and Hu [15], Hu and Zhou [53], Chen [11], Zhang [97, 98, 99], Hu [50], and Chen and Epstein [13].

Inspired by the above problems and related works, this paper mainly studies time-consistent dynamic coherent risk measures and their limit theory. The research work is divided into seven chapters. The main framework and results are as follows.

In Chapter 1, we mainly study descriptions of the time consistency of dynamic coherent risk measures. We first review the basic knowledge of risk measures, give the definition and representation theorem of coherent risk measures, and define the time consistency of dynamic risk measures. We give examples to illustrate the shortcomings of two common risk measure tools, VaR and ES. Then, in order to study the various conditions under which dynamic coherent risk measures satisfy time consistency, we study six different models: Stability models, Rectangularity models, IID models and BU models from the perspective of probability, and g-expectations and sublinear expectations from the perspective of expectation. We give the connections and differences among them, which lays a foundation for the follow-up research.

In Chapter 2, we mainly study laws of large numbers for dynamic coherent risk measures. In the first part, starting from general dynamic coherent risk measures, we only assume that time consistency holds, without considering specific representations of the risk measures. Three different forms of the law of large numbers are given for the average value of a portfolio, which together describe the limit behavior of portfolio risks and provide a new theoretical basis for the numerical calculation of portfolio risks. In the second part, we use the Stability model and the g-expectation separately to induce two different time-consistent dynamic coherent risk measures, and obtain the corresponding laws of large numbers. In addition, we study the existence and uniqueness of the time-consistent dynamic coherent risk measures induced by the Stability model, and evaluate the risks of financial assets driven by geometric Brownian motions using the time-consistent dynamic coherent risk measures induced by g-expectations.

In Chapter 3, we mainly study the law of large numbers for an array of random variables under the Stability model. Based on the time-consistent dynamic coherent risk measures induced by the Stability model in Chapter 2, we generalize the main results of Chapter 2 and obtain a law of large numbers for an array of random variables. At the same time, we give the definition of m-dependence under the Stability model, and use the law of large numbers for arrays to derive the corresponding law of large numbers for a sequence of random variables satisfying the m-dependence hypothesis.
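As a purely illustrative aside, not the construction of Chapters 2 and 3, the following simulation sketches the kind of limit behavior these laws of large numbers describe: under mean uncertainty, sample averages no longer converge to a single constant but fill a whole interval, the maximal distribution of Peng's nonlinear law of large numbers. The interval [-1, 1], the Gaussian noise and all sample sizes are assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_low, mu_high = -1.0, 1.0          # assumed interval of mean uncertainty
n, n_models = 10_000, 200

# Each "model" P in the prior set fixes a different mean in [mu_low, mu_high];
# under that model the observations are i.i.d. N(mu, 1).
mus = np.linspace(mu_low, mu_high, n_models)
X = rng.normal(loc=mus[:, None], scale=1.0, size=(n_models, n))
avg = X.mean(axis=1)                  # S_n / n under each model

# Classically S_n/n would converge to one constant. Under mean uncertainty
# the limit points spread over the whole interval [mu_low, mu_high]:
print(avg.min(), avg.max())           # roughly -1.0 and 1.0

# The upper expectation of phi(S_n/n), i.e. sup over models of E_P[phi],
# approaches the maximum of phi over [mu_low, mu_high] (here phi(x) = x^2).
phi = lambda x: x**2
print(max(phi(x) for x in avg))       # roughly 1.0
```

This is only the simplest mean-uncertainty picture; the results of Chapters 2 and 3 are stated for time-consistent dynamic coherent risk measures and for arrays, not for this toy Gaussian setting.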
In Chapter 4, we mainly study the central limit theorem under the BU model. After completing the research on the laws of large numbers for dynamic coherent risk measures in the previous two chapters, in this chapter we consider one of the simplest Stability models, the BU model. There are two main research objects in this chapter: one is the set P of probability measures corresponding to the BU model, and the other is the set A of all predictable processes in the classical probability space taking values only in {σ̲, σ̄}. We first prove a special form of time consistency for P, and obtain a similar result for A. Then we use P and A to construct two subadditive functionals, and prove that they both satisfy the dynamic programming principle. Finally, under the assumption that the random variables satisfy Lindeberg's condition, we prove the central limit theorem for the dynamic coherent risk measures induced by the BU model using the obtained dynamic programming principles, which establishes the connection between P and A. The central limit theorem we obtain considers not only the influence of variance uncertainty but also that of mean uncertainty, so it can be regarded as a new attempt at central limit theorems in the field of dynamic coherent risk measures (or sublinear expectations).

In Chapter 5, we mainly study the decomposition theorem of G-Brownian motions. Inspired by the results of the previous chapter, in this chapter we consider the decomposition of G-Brownian motions in the sense of identical distribution. We first review the definitions and related properties of Ocone martingales in the classical probability framework and the definition of G-Brownian motions in the G-expectation space proposed by Peng. After that, we further study the stochastic integral representation of G-Brownian motions in the classical probability framework given by Denis et al. [26], and obtain a more detailed characterization, which proves that the maximal distribution of the stochastic integrals of all processes taking values in [σ̲, σ̄] is the same as that of processes taking values in {σ̲, σ̄}. It follows that the distribution of a G-Brownian motion is the same as that of a linear combination of a standard Brownian motion and an Ocone martingale. Finally, using this decomposition theorem, we give a new proof of the central limit theorem under the BU model of Chapter 4, and obtain a rough description of the G-normal distribution.

In Chapter 6, we mainly study the complete convergence of an array of random variables under general sublinear expectations. In this chapter, we abandon the condition of time consistency and consider general sublinear expectations (coherent risk measures). We first give a definition of widely negative dependence for random variables, and obtain an exponential inequality for a sequence of widely negatively dependent random variables. This exponential inequality is then used to give three different forms of complete convergence for an array of widely negatively dependent random variables. Finally, as an application, we use the results obtained to prove the complete convergence of an independent and identically distributed array of random variables, and obtain a strong law of large numbers for such an array via the Borel-Cantelli lemma.

In Chapter 7, we summarize the main content and innovation points of this paper, and look forward to the next stage of research work.