
Kernels For Feature Extraction And Research On Nonlinear Multiple Kernel Learning

Posted on: 2011-02-17
Degree: Master
Type: Thesis
Country: China
Candidate: J B Li
Full Text: PDF
GTID: 2178360302964553
Subject: Computer application technology

Abstract/Summary:
Kernel methods are considered very effective and powerful tools for nonlinear classification. This approach to pattern classification first embeds the data in a suitable feature space, and then uses algorithms based on linear algebra, geometry, and statistics to discover patterns in the embedded data. The algorithms are implemented in such a way that the coordinates of the embedded points are never needed, only their pairwise inner products, which can be computed efficiently and directly from the original data items using a kernel function. Kernel methods can be used for both feature extraction and classification.

Feature extraction is very important for pattern classification. In the growing field of brain-computer interfaces, extracting features from EEG signals is of great importance. Because some signal distributions may have a latent nonlinear structure that linear methods cannot capture, developing nonlinear spatial filters is a necessary task.

Owing to the lack of an explicit discriminative objective function, the significance and potential of traditional spatial filters cannot be understood intuitively. We first present an extreme energy difference (EED) method based on the covariances of the signals, which has a desirable explicit discriminative objective function. Since EED is linear, we combine it with kernel methods and propose a kernel extreme energy difference (KEED) method.

Classifier design is essential in pattern classification and to some extent determines classification performance. Support vector machines (SVMs) have been successfully applied to classification problems. However, they are usually based on a single kernel, and the kernel is chosen before learning, which makes them inflexible. In many applications it is desirable to use multiple kernels. Multiple kernel learning (MKL) enables us to optimize over linear combinations of kernels. Despite its success, MKL neglects useful information generated by the interaction of different kernels.
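The kernel trick described above can be sketched in a few lines: a Gram matrix of pairwise inner products is computed directly from the raw data, without ever constructing the embedded points. The Gaussian (RBF) kernel and the toy data below are illustrative choices, not taken from the thesis.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Pairwise inner products in the (implicit) RBF feature space,
    computed directly from the original data items."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)

# Three toy data points; no explicit embedding is ever constructed.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
K = rbf_kernel(X, X)  # 3x3 Gram matrix of pairwise inner products
```

Any algorithm that only consumes `K` (symmetric and positive semidefinite) can operate in the feature space without knowing the coordinates of the embedded points.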
In this paper, we propose SVMs based on a nonlinear combination of multiple kernels (NCMK), which not only better meets the needs of practical applications but also overcomes this drawback of MKL.
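To illustrate the contrast between MKL-style linear combinations and a nonlinear combination of kernels, the sketch below builds both from two base kernels. The elementwise (Hadamard) product is used here only as one simple example of a nonlinear combination; the thesis's actual NCMK formulation is not given in this abstract, and the fixed weights stand in for weights that MKL would learn.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(X**2, axis=1)[None, :] - 2.0 * X @ X.T)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))

K1 = X @ X.T        # linear base kernel
K2 = rbf_kernel(X)  # RBF base kernel

# MKL-style: a linear combination of base kernels (weights would
# normally be optimized; fixed here purely for illustration).
K_linear_comb = 0.3 * K1 + 0.7 * K2

# One simple *nonlinear* combination: the elementwise (Hadamard)
# product, which is again a valid kernel by the Schur product theorem.
K_nonlinear_comb = K1 * K2
```

Both combined matrices remain symmetric positive semidefinite, so either can be plugged into a standard SVM solver in place of a single kernel.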
Keywords/Search Tags: brain-computer interface (BCI), EEG signal classification, feature extraction, kernel machine, multiple kernel learning (MKL), support vector machines (SVMs), semi-definite programming (SDP), pattern classification