
Analysis Of The Capacity Region Of Multi-source Multi-sink Channel

Posted on: 2011-08-28 | Degree: Master | Type: Thesis
Country: China | Candidate: Q Chen | Full Text: PDF
GTID: 2178360308452489 | Subject: Communication and Information System
Abstract/Summary:
In this paper, we analyze the capacity region of the multi-source multi-sink orthogonal channel through its equivalent problem, the characterization of the entropy function region Γ*. The region Γn* of n discrete random variables is a region in the entropy space Hn determined by information inequalities. There are two classes of information inequalities. One is the traditional Shannon-type information inequalities, which express the non-negativity of information measures such as joint entropy and mutual information. The other is the so-called non-Shannon-type information inequalities, represented by the Zhang-Yeung inequality. Since there exist infinitely many information inequalities, the full characterization of Γn* is a great challenge for information theorists; from its definition in 1997 to now, it has remained an open problem.

There is a duality between Γ* and the capacity region of the multi-source multi-sink orthogonal channel. With network coding, the capacity region can be characterized by an expression containing ΓN*, the finite entropy function region, which is a subset of Γ*; network coding is therefore included in our study. The capacity region of a single-source network (orthogonal channel) can be achieved by linear network coding, but this result cannot be generalized to the multi-source case, which is why we give the expression mentioned above. We prove that rates outside the region determined by this expression cannot be achieved, and that rates inside the region can be achieved. Since the expression contains ΓN*, the problem of determining the capacity region of the multi-source multi-sink orthogonal channel is not yet completely solved.

Because characterizing the capacity region of the multi-source multi-sink orthogonal channel is equivalent to characterizing Γ*, and because a full characterization of Γ* is so difficult, we try to make progress by finding inner bounds on Γ* and by solving simplified subproblems.

Studying the relationship between group theory and entropy functions is one such method. Entropy functions defined by groups are called group-characterizable entropy functions: for a finite group G with subgroups G1, ..., Gn, the vector h with h(α) = log(|G|/|Gα|), where Gα is the intersection of the Gi for i in α, is entropic. The region ΓG* of such functions is a subset of Γn*. Since the convex closure of ΓG* equals the closure of Γn*, it reflects most of the character of Γn*. Two subsets of ΓG* also have nice properties: Γab*, the region of abelian-group-characterizable entropy functions, and ΓL*, the region of linear-group-characterizable entropy functions. Γn* and its subsets ΓG*, Γab*, and ΓL* all have dualities with inner bounds on the capacity region of multi-source multi-sink network coding (the orthogonal channel).

Matroid theory is another tool for studying entropy functions and the capacity region of multi-source multi-sink network coding (the orthogonal channel). A matroid is a combinatorial structure that abstracts the independence relations among the elements of a set. With matroid theory, we can study the relations among the independent source information flows and from them approach the capacity region. In this paper, we mainly introduce a method that constructs networks from a given matroid; the networks constructed by this method reflect the dependency relations of the matroid. Using the networks constructed from the Fano and non-Fano matroids, and their combined network, we prove the insufficiency of linear network codes for achieving the capacity region of the multi-source multi-sink orthogonal channel.
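As a minimal, self-contained illustration of the dichotomy this argument exploits (the sketch below is our own, not the thesis's construction, and assumes the standard representation of the Fano matroid by the seven nonzero vectors of GF(2)^3): the triple of points 011, 101, 110 is dependent over GF(2), because the three vectors sum to zero, yet independent over every field of odd characteristic. This is why the Fano matroid is representable only in characteristic 2, the non-Fano matroid only in characteristic other than 2, and why a network combining the two cannot be solved by linear codes over any single field.

```python
# Illustrative only: rank of the critical Fano triple over GF(2) vs. GF(3).

def rank_mod_p(rows, p):
    """Rank of an integer matrix over GF(p) (p prime), by Gaussian elimination."""
    m = [[x % p for x in row] for row in rows]
    rank, n_rows, n_cols = 0, len(m), len(m[0])
    for col in range(n_cols):
        # Find a pivot for this column at or below the current rank row.
        pivot = next((r for r in range(rank, n_rows) if m[r][col]), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        inv = pow(m[rank][col], p - 2, p)  # multiplicative inverse in GF(p)
        m[rank] = [(x * inv) % p for x in m[rank]]
        for r in range(n_rows):
            if r != rank and m[r][col]:
                f = m[r][col]
                m[r] = [(a - f * b) % p for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# The columns 011, 101, 110 of the 7-point binary representation of Fano.
triple = [[0, 1, 1],
          [1, 0, 1],
          [1, 1, 0]]

print(rank_mod_p(triple, 2))  # 2: dependent over GF(2), a line of the Fano plane
print(rank_mod_p(triple, 3))  # 3: independent over GF(3), as in the non-Fano matroid
```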
Average entropy functions are a simplified version of entropy functions. We prove that Φn*, the region of average entropy functions, can be fully characterized by Shannon-type information inequalities; average entropy functions may therefore be useful in determining the capacity region of multi-source multi-sink random networks with symmetric structure.
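For concreteness, the following is one common convention for the averaging, together with the classical Han inequalities, which are examples of the Shannon-type constraints such averages obey; the abstract does not spell out the thesis's exact definition of Φn*, so this is an assumed formulation.

```latex
% Average of an entropy function over all subsets of each fixed size k
% (assumed convention; the thesis's precise definition of \Phi_n^* may differ).
\[
  \bar h_k \;=\; \binom{n}{k}^{-1}
  \sum_{\substack{\alpha \subseteq \{1,\dots,n\} \\ |\alpha| = k}} H(X_\alpha),
  \qquad k = 1, \dots, n .
\]
% Two classical Shannon-type consequences (Han's inequalities): the averages
% are nondecreasing in k, while the per-coordinate averages are nonincreasing.
\[
  \bar h_1 \le \bar h_2 \le \cdots \le \bar h_n ,
  \qquad
  \frac{\bar h_1}{1} \ge \frac{\bar h_2}{2} \ge \cdots \ge \frac{\bar h_n}{n} .
\]
```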
In conclusion, the characterization of the capacity region of the multi-source multi-sink orthogonal channel, and its equivalent problem, the characterization of Γ*, are both extremely challenging, and information theorists still have a long way to go to solve them completely.

Keywords/Search Tags: capacity region, entropy functions, network coding, group-characterizable entropy functions, matroid, average entropy functions