
Bayesian nonparametric methods for non-exchangeable data

Posted on: 2014-07-16    Degree: Ph.D    Type: Thesis
University: Dartmouth College    Candidate: Foti, Nicholas J    Full Text: PDF
GTID: 2450390008460300    Subject: Computer Science
Abstract/Summary:
Bayesian nonparametric methods have become increasingly popular in machine learning for their ability to let the data determine model complexity. In particular, Bayesian nonparametric versions of common latent variable models can learn an effective dimension of the latent space. Examples include mixture models, latent feature models, and topic models, where the number of components, features, or topics need not be specified a priori. A drawback of many of these models is that they assume the observations are exchangeable; that is, any dependencies between observations, such as those induced by time or other covariates, are ignored. This thesis contributes general methods for incorporating covariates into Bayesian nonparametric models, together with inference algorithms for learning with these models. First, we present a flexible class of dependent Bayesian nonparametric priors that induce covariate dependence in a variety of latent variable models used in machine learning. The proposed framework has appealing analytic properties and admits a simple inference algorithm. We show how the framework can be used to construct a covariate-dependent latent feature model and a time-varying topic model. Second, we describe the first general-purpose inference algorithm for a large family of dependent mixture models. Using the idea of slice sampling, the algorithm is truncation-free and fast, showing that inference can be done efficiently despite the added complexity that covariate dependence entails. Finally, we construct a Bayesian nonparametric framework to couple multiple related latent variable models and apply the framework to learning from multiple views of data.
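To illustrate the kind of covariate dependence the abstract describes, the following is a minimal NumPy sketch of a kernel-weighted stick-breaking construction: global Dirichlet-process-style weights are modulated by a Gaussian kernel between an observation's covariate and each atom's covariate location, so nearby covariates share mixture components. The function names, the Gaussian kernel, and the fixed truncation level are illustrative assumptions, not the thesis's exact construction (which is truncation-free).

```python
import numpy as np

def stick_breaking_weights(alpha, num_atoms, rng):
    """Draw truncated stick-breaking weights for a Dirichlet process."""
    betas = rng.beta(1.0, alpha, size=num_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

def covariate_dependent_weights(x, atom_locs, base_weights, lengthscale=1.0):
    """Modulate global weights by a Gaussian kernel between the covariate x
    and each atom's covariate location, then renormalize."""
    kernel = np.exp(-0.5 * ((x - atom_locs) / lengthscale) ** 2)
    w = base_weights * kernel
    return w / w.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K = 20                                     # truncation level (illustrative)
    base_weights = stick_breaking_weights(alpha=2.0, num_atoms=K, rng=rng)
    atom_locs = rng.uniform(0.0, 10.0, size=K)   # covariate locations of atoms
    atom_means = rng.normal(0.0, 5.0, size=K)    # mixture component means

    # At two different covariate values the induced mixture weights differ,
    # so observations with nearby covariates tend to share components.
    for x in (1.0, 9.0):
        w = covariate_dependent_weights(x, atom_locs, base_weights)
        k = rng.choice(K, p=w)
        y = rng.normal(atom_means[k], 1.0)
        print(f"x={x:.1f}: top component {np.argmax(w)}, sampled y={y:.2f}")
```

The sketch uses a fixed truncation only for clarity; the truncation-free slice-sampling inference described in the abstract avoids choosing such a level in advance.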
Keywords/Search Tags:Nonparametric, Latent variable models, Methods, Framework