
Fast inference algorithms in dependency networks

Posted on: 2012-02-15
Degree: M.S.
Type: Thesis
University: University of Oregon
Candidate: Shamaei, Arash
Full Text: PDF
GTID: 2468390011966241
Subject: Artificial Intelligence
Abstract/Summary:
Dependency networks, a compelling alternative to Bayesian networks, are a directed, potentially cyclic probabilistic graphical model for representing joint probability distributions over sets of variables. Gibbs sampling, the standard inference method for dependency networks, is often slow because sampling-based approaches require many iterations to converge. This thesis proposes the mean field update equation as an alternative inference algorithm. Experimental results on 12 real-world datasets show that mean field inference in dependency networks offers accuracy similar to that of Gibbs sampling, but with orders-of-magnitude improvements in speed. Compared to Bayesian networks learned on the same data, dependency networks offer higher accuracy as the amount of evidence grows. Furthermore, mean field inference is consistently more accurate in dependency networks than in Bayesian networks learned on the same data. In addition to the experiments on these real-world datasets, the thesis discusses applying the mean field update equation to relational dependency networks learned from relational databases.
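
To make the contrast with Gibbs sampling concrete, the following is a minimal Python sketch of a mean-field-style update in a toy dependency network: each variable's marginal is repeatedly set to its conditional probability averaged over the current marginals of its parents. The network structure, conditional probability tables, and the exact update form shown here are illustrative assumptions, not taken from the thesis.

    # Hypothetical sketch of mean-field-style inference in a small binary
    # dependency network. Each variable i has a conditional P(X_i = 1 | parents);
    # its marginal q[i] is updated to the conditional averaged over the parents'
    # current marginals. The toy network and tables below are made up.
    import itertools

    parents = {0: [1, 2], 1: [0], 2: [0, 1]}  # possibly cyclic, as in dependency networks
    cpt = {
        0: {(0, 0): 0.2, (0, 1): 0.6, (1, 0): 0.5, (1, 1): 0.9},
        1: {(0,): 0.3, (1,): 0.8},
        2: {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.7, (1, 1): 0.95},
    }

    def mean_field(marginals, iters=50, tol=1e-6):
        """Iterate until each marginal matches its expected conditional."""
        q = dict(marginals)  # q[i] = current estimate of P(X_i = 1)
        for _ in range(iters):
            max_delta = 0.0
            for i, pa in parents.items():
                # Average P(X_i = 1 | parent values) over the parents' marginals.
                new_q = 0.0
                for vals in itertools.product([0, 1], repeat=len(pa)):
                    weight = 1.0
                    for j, v in zip(pa, vals):
                        weight *= q[j] if v == 1 else 1.0 - q[j]
                    new_q += weight * cpt[i][vals]
                max_delta = max(max_delta, abs(new_q - q[i]))
                q[i] = new_q
            if max_delta < tol:
                break
        return q

    print(mean_field({0: 0.5, 1: 0.5, 2: 0.5}))

Each pass touches every variable once and needs no sampling, which is why updates of this kind can be far cheaper per unit of accuracy than drawing long Gibbs chains; the speed/accuracy trade-off reported above is the thesis's empirical finding, not something this toy example demonstrates.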
Keywords/Search Tags: Dependency networks, Mean field update equation, Inference