
Multi-view Generalized Eigenvalue Proximal Support Vector Machines

Posted on: 2017-05-07
Degree: Master
Type: Thesis
Country: China
Candidate: C Dong
Full Text: PDF
GTID: 2308330485469068
Subject: Computer application technology
Abstract/Summary:
With the continuous development of multimedia technology, multi-view data have become very common in practice. In recent years a great many methods for learning from multi-view data have been proposed, and multi-view learning has become a very active research direction. Compared with single-view methods, multi-view learning methods can improve performance by exploiting the feature sets of distinct views. The generalized eigenvalue proximal support vector machine (GEPSVM) is a simple and effective binary classification method in which each hyperplane is closest to one of the two classes and as far as possible from the other; it solves a pair of generalized eigenvalue problems to obtain two nonparallel hyperplanes.

This thesis studies multi-view learning with generalized eigenvalue proximal support vector machines and proposes three multi-view learning methods. First, we propose the multi-view generalized eigenvalue proximal support vector machine (MvGESVM), which effectively combines two views by introducing a multi-view co-regularization term that maximizes the consensus between the views, and transforms a complicated optimization problem into a simple generalized eigenvalue problem. Second, we propose the multi-view eigenvalue proximal support vector machine (MvESVM), which replaces the ratio used in MvGESVM with a difference when measuring how close each hyperplane is to one class and how far it is from the other, leading to a simpler standard eigenvalue problem. Finally, we propose the multi-view eigenvalue proximal support vector machine via gradient descent (MvGDSVM), which introduces the same multi-view co-regularization term as MvGESVM to combine two single-view eigenvalue proximal support vector machines and solves the resulting optimization problems by gradient descent. Linear MvGESVM, MvESVM and MvGDSVM are all extended to the nonlinear case via the kernel trick.

To verify the effectiveness of the proposed approaches, we evaluate their classification performance on multiple UCI data sets. The experimental results confirm the effectiveness of the approaches.
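For context, the following Python sketch (using NumPy and SciPy) illustrates the single-view GEPSVM building block that the three multi-view methods extend: each nonparallel hyperplane is obtained as the eigenvector belonging to the smallest eigenvalue of a generalized eigenvalue problem. This is a minimal illustration rather than the thesis's multi-view formulation; the function names, the Tikhonov parameter delta, and the small ridge added to keep the denominator matrix positive definite are illustrative assumptions.

import numpy as np
from scipy.linalg import eigh

def gepsvm_plane(A, B, delta=1e-3):
    """Single-view GEPSVM sketch: return a hyperplane w.x + gamma = 0 that is
    close to the rows of A and as far as possible from the rows of B.
    delta is an assumed Tikhonov regularization parameter."""
    # Augment samples with a column of ones so the bias gamma is part of z = [w; gamma].
    E = np.hstack([A, np.ones((A.shape[0], 1))])
    F = np.hstack([B, np.ones((B.shape[0], 1))])
    G = E.T @ E + delta * np.eye(E.shape[1])   # closeness to class A (numerator)
    H = F.T @ F + 1e-8 * np.eye(F.shape[1])    # distance from class B (denominator); tiny ridge for positive definiteness
    # Minimizing the Rayleigh quotient z^T G z / z^T H z is the generalized
    # eigenvalue problem G z = lambda H z; take the eigenvector of the smallest eigenvalue.
    _, vecs = eigh(G, H)
    z = vecs[:, 0]
    return z[:-1], z[-1]                       # (w, gamma)

def predict(x, planes):
    """Assign x to the class whose hyperplane is nearest in perpendicular distance."""
    dists = [abs(x @ w + g) / np.linalg.norm(w) for w, g in planes]
    return int(np.argmin(dists))

# Toy usage: two Gaussian blobs in 2D, one nonparallel plane per class.
rng = np.random.default_rng(0)
A = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
B = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))
planes = [gepsvm_plane(A, B), gepsvm_plane(B, A)]
print(predict(np.array([0.2, -0.1]), planes))  # expected: 0 (class A)

Replacing the ratio objective with a difference, in the spirit of MvESVM, would instead amount to taking the smallest eigenvector of a matrix of the form G - mu*H for some trade-off parameter mu, which is a standard eigenvalue problem; the precise multi-view co-regularized formulations are those developed in the thesis itself.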
Keywords/Search Tags: Multi-view learning, Generalized eigenvalue proximal support vector machine, Co-regularization, Gradient descent