
Probability Of Network Reconstruction

Posted on: 2011-03-05
Degree: Doctor
Type: Dissertation
Country: China
Candidate: W H Li
Full Text: PDF
GTID: 1118330332484372
Subject: Communication and Information System

Abstract/Summary:
The explosion of knowledge and the deluge of data make it necessary to extract useful information by data mining and to draw inferences from the discovered knowledge. Randomness and uncertain judgment are inherent in most real-world decision problems. People therefore need a paradigm that supports quantitative measures of uncertain statements, together with methods for combining those measures, so that reasoning and decision making under uncertainty can be automated. Probabilistic networks, notably Bayesian networks and influence diagrams, have become an increasingly popular such paradigm, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, and data mining under uncertainty. Probabilistic networks are among the most promising technologies in applied artificial intelligence, and their representation and inference have become important topics in intelligent data analysis, knowledge discovery, and uncertain artificial intelligence.

To solve real-world problems, ever larger and more complex models must be learned. However, solving and learning probabilistic networks typically suffer from exponential growth in the number of variables modeled, and application demands vary and increase over time. It is therefore difficult, or even impossible, for models to always be solved effectively while meeting time-varying demands. To extend the applicability of models and improve the performance of reasoning under uncertainty, this thesis is devoted to refining and optimizing probabilistic networks and to learning new probabilistic networks from existing models without the help of data sets.

The main contributions and novelties of this thesis fall into three areas.

First, this thesis studies methods to reduce probabilistic networks. For Bayesian networks, it gives two necessary and sufficient conditions that determine their local graphical characterizations.
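The local characterizations and inference equivalences discussed in this thesis all rest on the chain-rule factorization of a Bayesian network: the joint distribution is the product of each variable's conditional distribution given its parents, and queries are answered by summing out the hidden variables. A minimal sketch with a hypothetical two-node network (the variable names and probabilities are illustrative, not from the thesis):

```python
from itertools import product

# Hypothetical two-node Bayesian network: Rain -> WetGrass
P_rain = {True: 0.2, False: 0.8}                      # P(Rain)
P_wet_given_rain = {True: {True: 0.9, False: 0.1},    # P(WetGrass | Rain)
                    False: {True: 0.1, False: 0.9}}

def joint(rain, wet):
    # Chain-rule factorization: P(R, W) = P(R) * P(W | R)
    return P_rain[rain] * P_wet_given_rain[rain][wet]

def marginal_wet(wet):
    # Inference by enumeration: sum the joint over the hidden variable
    return sum(joint(r, wet) for r in (True, False))

# Sanity check: the joint distribution sums to 1
total = sum(joint(r, w) for r, w in product((True, False), repeat=2))
print(round(marginal_wet(True), 2), round(total, 2))  # 0.26 1.0
```

Because the factorization is local, evidence on one part of the network only requires the factors that are graphically relevant, which is what makes reduction to local models worthwhile.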
A simple algorithm is then presented to learn local models without data sets. Moreover, the thesis defines an equivalence relation between nodes, and gives a method to obtain hierarchical Bayesian networks by stepwise aggregation of related nodes. For influence diagrams, the thesis defines a relation between decision variables whose decisions are made cooperatively and dependently, and proves that this relation is an equivalence relation. It then gives a formal definition, for each class of decision variables, of the relevant variables that affect the expected utility of those decisions. On this basis, a method is presented to learn local influence diagrams from the relevant decision variables and relevant variables. The local models obtained are exactly the submodels users are interested in, and the proposed approaches guarantee that inference in a local model is equivalent to inference in the global model. Hierarchical models encapsulate local information and detail, providing a hierarchical abstraction of complex problems. The reduction of probabilistic networks improves the structure of models, extends their applicability, and allows evidence to be propagated in a smaller model rather than the entire model. Thus both learning models from massive data and the subjectivity of expert knowledge can be avoided, and a foundation is laid for improving the performance of reasoning under uncertainty.

Second, the decomposability of probabilistic networks is discussed. For Bayesian networks, a simple extension of previous methods is introduced that allows a Bayesian network to be decomposed losslessly into a set of smaller Bayesian networks. For influence diagrams, the solution ordering is extended from individual decision variables to sets of decision variables, and extremal sets are introduced.
The globally optimal strategy can then be found locally on the extremal sets. The thesis studies relevance reasoning on influence diagrams based on extremal sets and presents a method to decompose influence diagrams. Extremal sets make it possible to decompose influence diagrams that do not imply any exact solution ordering among decision variables, and they effectively extend existing methods of relevance reasoning. Decomposition not only yields a collection of smaller models equivalent to the input model, but also provides a foundation for lower-complexity and improved solution algorithms.

Third, this thesis concentrates on methods for combining multiple local Bayesian networks. It establishes two necessary and sufficient conditions that determine the graphical characterization of the global model, and gives a combination method that loses no information and requires no data sets. When inconsistent and redundant information between the local models cannot be avoided, an algorithm is given to learn a minimal I-map that preserves as much of the local information as possible. These methods are proved to be more efficient than learning from data sets. With the proposed methods, a global and more general representation can be obtained without the help of data sets, effectively avoiding the subjectivity of expert knowledge, and the global models support more accurate conclusions. This is thus a good alternative and supplement to learning probabilistic networks from data.
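As a toy illustration of the combination problem (the structures and variable names are hypothetical, and this is not the thesis's algorithm): combining local Bayesian networks over shared variables at minimum requires merging their edge sets and checking that the merged structure is still a DAG, since a cyclic result cannot be a Bayesian network at all.

```python
def combine_local_structures(edges_a, edges_b):
    """Union of two local DAG edge sets, rejected if the result is cyclic."""
    edges = set(edges_a) | set(edges_b)
    nodes = {n for e in edges for n in e}
    # Kahn's algorithm: a topological order exists iff the union is acyclic
    indeg = {n: 0 for n in nodes}
    for _, v in edges:
        indeg[v] += 1
    ready = [n for n in nodes if indeg[n] == 0]
    visited = 0
    while ready:
        u = ready.pop()
        visited += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    ready.append(b)
    if visited != len(nodes):
        raise ValueError("combined structure contains a directed cycle")
    return edges

# Two toy local models over shared variables
print(sorted(combine_local_structures([("A", "B"), ("B", "C")],
                                      [("A", "C"), ("B", "D")])))
```

A real combination method must additionally reconcile the conditional probability tables and, as the thesis notes, handle inconsistent or redundant local information, for which the union of edges is only a starting point.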
Keywords/Search Tags:Bayesian network, influence diagram, probabilistic network, local model, reduction, decomposition, combination