
Efficient learning and inference in rich statistical representations

Posted on: 2011-08-18
Degree: Ph.D.
Type: Dissertation
University: University of Washington
Candidate: Lowd, Daniel
Full Text: PDF
GTID: 1448390002964745
Subject: Artificial Intelligence
Abstract/Summary:
Rich statistical representations such as Markov logic networks are essential for solving hard problems in artificial intelligence and machine learning. However, the increased complexity of learning and inference often limits their effectiveness in practice. In this dissertation, we make several contributions towards richer representations and the algorithms to support them. We introduce recursive Markov logic, a "deep" generalization of Markov logic that introduces uncertainty into every level of a first-order knowledge base. We also develop improved weight learning algorithms for Markov logic, leading to more accurate models in less time. Finally, we use arithmetic circuits to address the problem of inference in graphical models in two ways. First, we present an algorithm that learns Bayesian networks with fast inference by using inference complexity as a learning bias. Second, we show how to use arithmetic circuits in an extremely flexible form of variational inference.
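For added context (not part of the original abstract), the contributions above all build on the standard Markov logic formulation: a Markov logic network attaches a weight w_i to each first-order formula F_i and defines a log-linear distribution over possible worlds x,

P(x) = \frac{1}{Z} \exp\!\left( \sum_i w_i \, n_i(x) \right),

where n_i(x) is the number of true groundings of F_i in world x and Z is the partition function. Weight learning adjusts the w_i to fit data, and the arithmetic-circuit work compiles graphical models into circuits whose evaluation and differentiation yield probabilities and marginals efficiently.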
Keywords/Search Tags:Inference, Markov logic