
Research On Dimensionality Reduction Methods And Its Application In Face Gender Recognition

Posted on: 2014-05-17    Degree: Master    Type: Thesis
Country: China    Candidate: D Chen    Full Text: PDF
GTID: 2268330392472399    Subject: Signal and Information Processing
Abstract/Summary:
With the development of information technology, high-dimensional data must be processed in many fields. High-dimensional data brings great challenges, such as the curse of dimensionality and the Hughes effect. Dimensionality reduction plays an important role in dealing with the curse of dimensionality and the Hughes effect in data mining, computer vision, and machine learning. Dimensionality reduction is the process of transforming data from a high-dimensional space into a low-dimensional subspace through spectral analysis of sample matrices. It reveals the intrinsic structure of the distribution of measurements in the original high-dimensional space.

This paper focuses on two kinds of dimensionality reduction methods: 1) principal component analysis (PCA) and linear discriminant analysis (LDA); 2) dimensionality reduction methods based on manifold learning. The concept of a manifold comes from differential geometry. Manifold learning assumes that the samples are drawn from a manifold embedded in a high-dimensional space. In pattern recognition, distance on the manifold corresponds to the relation between patterns: samples from the same class are close to each other on the manifold, while samples from different classes are far away from each other. Through a manifold learning projection, distance on the manifold can be measured by Euclidean distance; in other words, the relations between patterns can also be measured by Euclidean distance. In this paper, locality preserving projections (LPP) and discriminative locality alignment (DLA) are studied in detail.

A novel manifold learning method called cosine-based discriminative alignment (CDA) is proposed in this paper. Compared with PCA, LDA, LPP, and DLA, the proposed CDA method has four advantages, as follows:
1) The solution of CDA is obtained by eigenvalue decomposition, which avoids the singular matrix problem.
2) The number of dimensions can be selected flexibly; the maximal number of dimensions is not limited by the number of classes.
3) CDA uses all the discriminative information, from both the same class and the different classes.
4) CDA has only one free parameter, which can be optimized easily.

To evaluate the performance of the methods, these five dimensionality reduction methods are applied to a face gender recognition experiment. The experimental results show that the proposed method outperforms the other dimensionality reduction methods.
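The abstract does not reproduce the CDA derivation, so the sketch below only illustrates the shared computational pattern it describes: learning a linear projection to a low-dimensional subspace by eigenvalue decomposition, using PCA (one of the compared methods) as the example. The function name pca_projection, the NumPy implementation, and the 1024-dimensional feature size are illustrative assumptions, not the thesis code.

```python
import numpy as np

def pca_projection(X, n_components):
    """Learn a projection by eigendecomposition of the sample covariance.

    X: (n_samples, n_features) data matrix.
    Returns (W, X_low), where W is (n_features, n_components) and
    X_low = (X - mean) @ W is the low-dimensional embedding.
    """
    X_centered = X - X.mean(axis=0)
    # Sample covariance matrix of the high-dimensional data.
    cov = np.cov(X_centered, rowvar=False)
    # Eigenvalue decomposition; eigh is used because cov is symmetric.
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Keep the eigenvectors associated with the largest eigenvalues.
    order = np.argsort(eigvals)[::-1][:n_components]
    W = eigvecs[:, order]
    return W, X_centered @ W

# Example: reduce 1024-dimensional face features to 50 dimensions.
X = np.random.rand(200, 1024)   # placeholder data, e.g. vectorized face images
W, X_low = pca_projection(X, 50)
print(X_low.shape)              # (200, 50)
```

Methods such as LPP, DLA, and the proposed CDA follow the same template but build a different matrix (encoding local or discriminative structure) whose eigenvectors give the projection; the details of the CDA objective are in the full text.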
Keywords/Search Tags: dimensionality reduction, manifold learning, eigenvalue decomposition, face gender recognition