
Multi-View Gaussian Processes

Posted on: 2019-03-21
Degree: Master
Type: Thesis
Country: China
Candidate: Q Y Liu
Full Text: PDF
GTID: 2428330566960647
Subject: Computer Science and Technology
Abstract/Summary:
Gaussian processes (GPs) have proven to be popular and effective tools across a wide range of machine learning applications, especially in single-view settings. With the growth of big data, data forms are becoming increasingly diverse, and multi-view data now arise frequently in many machine learning tasks. However, standard GPs address only the scenario where data from a single view are provided, and there are very few applications of GPs to multi-view learning. This thesis therefore studies multi-view Gaussian process models, including multi-view regularized Gaussian processes, multi-view deep Gaussian processes, and sparse multi-view Gaussian processes.

First, we employ the idea of co-regularization and propose multi-view regularized Gaussian processes (MvGPs), a straightforward extension of GPs to multi-view learning with a convenient implementation. Unlike existing methods, MvGPs combine multiple views by regularizing the marginal likelihood with a consistency term between the posterior distributions of the latent functions from different views. To make more reasonable use of multi-view consistency, we also present a general point selection scheme for multi-view learning, which constrains the multi-view consistency assumption to selected, important points; based on this scheme, we improve the model.

Second, many real-world data are complex and may require hierarchical models. To handle such data, we apply deep Gaussian processes (DGPs) to the multi-view scenario and develop multi-view deep Gaussian processes (MvDGPs). In contrast with DGPs, MvDGPs can model different views with different depths, which better characterizes the discrepancies among the views.

Finally, to apply multi-view Gaussian processes to large-scale data, we present two sparse approaches for multi-view GPs: the maximum informative vector machine (mIVM) and the alternative manifold-preserving (aMP) method. Inspired by information theory, the mIVM attempts to obtain the maximum amount of information from all views with the minimum number of data points, while the aMP, motivated by the manifold-preserving principle, leverages an alternative selection strategy that makes use of data from all views to preserve high space connectivity. Experimental results on multiple real-world data sets verify the effectiveness of all the proposed models.
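To illustrate the information-driven selection idea behind the mIVM, the following is a minimal sketch (not the thesis's actual algorithm; kernel, hyperparameters, and function names are illustrative assumptions) of a greedy scheme that repeatedly picks the point with the largest GP predictive variance given the points selected so far, i.e., the point contributing the most remaining information:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential kernel matrix between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def greedy_informative_subset(X, m, noise=1e-2):
    """Greedily pick m point indices; each step takes the candidate with
    the largest GP predictive variance given the points chosen so far
    (an entropy-reduction criterion in the spirit of the IVM)."""
    n = X.shape[0]
    selected = []
    for _ in range(m):
        best, best_var = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            if not selected:
                # Prior variance before any point is conditioned on.
                var = rbf_kernel(X[i:i+1], X[i:i+1])[0, 0] + noise
            else:
                S = X[selected]
                K_SS = rbf_kernel(S, S) + noise * np.eye(len(selected))
                k_iS = rbf_kernel(X[i:i+1], S)
                # Posterior variance of point i given the selected set S.
                var = (rbf_kernel(X[i:i+1], X[i:i+1])
                       - k_iS @ np.linalg.solve(K_SS, k_iS.T))[0, 0] + noise
            if var > best_var:
                best, best_var = i, var
        selected.append(best)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
idx = greedy_informative_subset(X, m=5)
print(idx)
```

In a multi-view setting such as the mIVM, the same criterion would be evaluated over the candidate points of all views jointly, so that the budget of selected points is spent where the information gain is highest.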
Keywords/Search Tags: Gaussian processes, deep Gaussian processes, multi-view learning, sparse approximation, variational inference