Mixtures Of Gaussian Processes Based Classification Model

Posted on: 2018-11-09
Degree: Master
Type: Thesis
Country: China
Candidate: C Luo
Full Text: PDF
GTID: 2348330512987255
Subject: Computer application technology
Abstract/Summary:
In the Bayesian nonparametric framework, the Gaussian process is a widely used and powerful modeling tool that has been applied to both classification and regression problems. Mixture-of-Gaussian-process models have been proposed to improve data-modeling capacity and to reduce the computational complexity of model training. However, the existing mixture models are all built on Gaussian process regression, which is intended for regression problems.

In this dissertation, we propose a novel infinite mixtures of Gaussian processes based classification model (MGPC). The proposed model adopts the logistic function as the likelihood, so that the output variables represent the probabilities that data points belong to the positive or negative class; MGPC is therefore better suited to classification problems. The mixture weights follow a Dirichlet process (DP), which allows the model to incorporate infinitely many mixture components. Because we adopt a fully Bayesian modeling approach, introducing the logistic likelihood and the DP leads to intractable posterior distributions over the latent variables. Following the framework of variational inference, we derive an updating algorithm for the approximate posterior distributions of all latent variables. To make the updating algorithm tractable, we employ the linear Gaussian process model, an equivalent parametric representation of the Gaussian process that provides conditional independence among the output variables. The variational expectation-maximization algorithm is adopted to optimize the hyperparameters.

Experimental results on nine real-world data sets show that MGPC achieves better classification performance than five other commonly used models. Analyzing the experimental results from multiple points of view, we obtain a new finding that differs from the existing consensus: for classification, MGPC is significantly better than the regression model with mixtures of GPRs, while their single-model counterparts are comparable. Finally, we discuss how the truncation level of the approximate posterior distribution affects classification performance and validate the discussion through additional experiments, whose results also show the potential of the proposed model for further gains in classification performance.
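To make the role of the truncation level concrete, the following is a minimal sketch (not the dissertation's implementation) of the truncated stick-breaking construction commonly used to represent DP mixture weights in variational inference. The concentration parameter `alpha` and the function name are illustrative assumptions; the dissertation's actual variational updates are not reproduced here.

```python
import numpy as np

def stick_breaking_weights(alpha, truncation, seed=None):
    """Sample mixture weights from a truncated stick-breaking
    construction of the Dirichlet process.

    alpha      -- DP concentration parameter (illustrative value below)
    truncation -- number of components kept in the approximation
    """
    rng = np.random.default_rng(seed)
    # Beta(1, alpha) stick fractions v_1..v_{T-1}; the last component
    # absorbs all remaining probability mass.
    v = rng.beta(1.0, alpha, size=truncation - 1)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)))
    weights = remaining * np.concatenate((v, [1.0]))
    return weights

# A larger truncation level leaves less mass in the final component,
# so the truncated approximation better matches the infinite mixture.
w = stick_breaking_weights(alpha=2.0, truncation=10, seed=0)
print(w)  # ten nonnegative weights summing to 1 (up to floating point)
```

The telescoping product guarantees the weights sum to one for any truncation level, which is why the truncation acts purely as an approximation knob rather than changing the model itself.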
Keywords/Search Tags:Gaussian processes, mixture models, Dirichlet processes, probabilistic graphical models, variational inference