
Nonparametric Margin Maximization Criterion And Its Applications

Posted on: 2007-05-29
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X P Qiu
Full Text: PDF
GTID: 1118360212984376
Subject: Computer applications
Abstract/Summary:
Real-world pattern recognition problems, such as face recognition and text categorization, often involve very high dimensionality and a limited number of samples, and therefore face the "curse of dimensionality". Feature extraction thus plays an important role in pattern recognition and machine learning.

Linear Discriminant Analysis (LDA) is a popular feature extraction technique in statistical pattern recognition. However, it often suffers from the small sample size problem when dealing with high-dimensional data. Moreover, while LDA is guaranteed to find the best discriminant directions when each class has a Gaussian density with a common covariance matrix, it can fail when the class densities are more general.

In this dissertation, a new nonparametric feature extraction method, the nonparametric margin maximization criterion (NMMC), is proposed from the point of view of nearest neighbor classification. NMMC finds the important discriminant directions without assuming that the class densities belong to any particular parametric family, and it does not depend on the nonsingularity of the within-class scatter matrix.

We also apply NMMC to face recognition. The results demonstrate that NMMC outperforms the existing LDA variants and other state-of-the-art face recognition approaches on three datasets from the ATT and FERET face databases.

We further propose a linear feature extraction method, info-margin maximization (Info-Margin), from an information-theoretic view. It aims at a low generalization error by maximizing the information divergence between the distributions of different classes while minimizing the entropy of the distribution within each single class. We estimate the within-class density with a Gaussian-kernel Parzen window and give an efficient algorithm based on quadratic entropy and divergence measures, which avoids histogram-based integration of density functions and converges quickly.
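The quadratic-entropy estimate mentioned above has a well-known closed form: for a Gaussian-kernel Parzen estimate, the integral of the squared density reduces to a mean of pairwise Gaussians with doubled variance, so no histogram-based numerical integration is needed. A minimal sketch (not the dissertation's actual code; the kernel width `sigma` is an assumed parameter):

```python
import numpy as np

def gaussian(diff, var):
    """Zero-mean isotropic Gaussian density with variance `var` per dimension."""
    d = diff.shape[-1]
    norm = (2 * np.pi * var) ** (-d / 2)
    return norm * np.exp(-np.sum(diff**2, axis=-1) / (2 * var))

def renyi_quadratic_entropy(X, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2 = -log(integral of p(x)^2).

    With a Gaussian kernel, the integral (the "information potential") is the
    mean of all pairwise Gaussians evaluated at sample differences, with the
    kernel variance doubled by the convolution of two Gaussians.
    """
    diffs = X[:, None, :] - X[None, :, :]              # (N, N, d) pairwise differences
    info_potential = gaussian(diffs, 2 * sigma**2).mean()
    return -np.log(info_potential)
```

For a single sample the estimate reduces to the entropy of one Gaussian kernel, and spreading the samples apart increases the estimated entropy, as expected.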
Experimental results show that our method outperforms the traditional feature extraction methods for classification and data visualization.
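For background, the small sample size problem that the abstract attributes to LDA can be seen numerically: with fewer samples than dimensions, the within-class scatter matrix is singular, so the classical eigenproblem involving its inverse cannot be solved directly. A minimal illustration (not taken from the dissertation):

```python
import numpy as np

def within_class_scatter(X, y):
    """Sw: sum over classes of the class-centered outer products."""
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        Sw += Xc.T @ Xc
    return Sw

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 100))          # 10 samples in 100 dimensions
y = np.array([0] * 5 + [1] * 5)
Sw = within_class_scatter(X, y)
# Rank is at most N - (number of classes) = 8, far below d = 100,
# so Sw is singular and Sw^{-1} does not exist.
print(np.linalg.matrix_rank(Sw))
```

This is exactly the regime (e.g. face images) where methods like NMMC, which do not require a nonsingular within-class scatter matrix, remain applicable.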
Keywords/Search Tags: Feature Extraction, Linear Discriminant Analysis, Face Recognition, Nonparametric Margin Maximization Criterion, Stepwise Dimensionality Reduction, Info-Margin Maximization, Nearest Neighbor Classification