
Analysis And Research On Improving Linear Discriminant Analysis

Posted on: 2013-09-23
Degree: Master
Type: Thesis
Country: China
Candidate: C Y Miao
Full Text: PDF
GTID: 2248330395479633
Subject: Computational Mathematics
Abstract/Summary:
Linear discriminant analysis and data dimensionality reduction currently have important applications and research value in pattern recognition. This thesis focuses on improved linear discriminant analysis algorithms, their application to face recognition, and the equivalence relations among the improved algorithms. Dimensionality reduction is widely used in many fields: it maps sample data from a high-dimensional space to a low-dimensional space through a linear or nonlinear transformation, preserving as much of the original information as possible so that high-dimensional data can be classified effectively while avoiding the "curse of dimensionality". The research proceeds as follows.

1. Face recognition is an active research topic in pattern recognition and artificial intelligence, and many methods have been applied to it. Feature extraction is a key issue in pattern recognition; for face recognition in particular, effective identification depends on extracting discriminative features from face images. Over the past few decades, researchers have proposed and studied a large number of dimensionality reduction methods, which, as a way to overcome the curse of dimensionality, occupy an important position in the relevant fields. Feature extraction methods, including classical linear methods, manifold learning methods, and linear manifold learning methods, are introduced and analyzed.

2. High-dimensional data arise frequently in practical applications such as face recognition, transaction data, and multimedia data, and the success of high-dimensional data processing depends largely on the choice of dimensionality reduction method. To achieve the best classification performance after dimensionality reduction, this thesis analyzes and studies the linear discriminant analysis family of algorithms. First, at the theoretical level, a comprehensive analysis of dimensionality reduction in the null space of the within-class scatter matrix and in its column space leads to the conclusion that, under the C1 condition, OLDA, ULDA, and NLDA are equivalent to DLDA-ST, with all of them sharing the same form of optimal solution. The thesis then summarizes and compares the advantages and disadvantages of the extended LDA methods and discusses likely directions for the development of dimensionality reduction.

3. This thesis presents an improved linear discriminant analysis algorithm, an optimized LMNLDA algorithm for face recognition, which achieves an optimal classification of the sample points. The method is designed to solve the small sample size problem and to overcome overlap between samples in the sample space, so that the projected samples have the best separability: high-dimensional samples are projected onto the most discriminative low-dimensional subspace. The definition of the scatter matrices takes into account the influence of the projection direction on the data, which not only overcomes overlap between samples but also resolves the problem of edge samples influencing the projection direction, making face recognition more accurate. The optimized LMNLDA is compared with current methods on the YALE, ORL, and PIE face databases, and the experimental results verify the effectiveness and feasibility of the improved algorithm.
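To make the discussion in points 2 and 3 concrete, the following is the standard formulation of the LDA scatter matrices and the trace-ratio criterion on which the OLDA, ULDA, NLDA, and DLDA-ST variants are all built; it is a generic sketch, and the thesis's specific C1 condition and equivalence proofs are not reproduced here.

```latex
% Standard LDA quantities (generic formulation, not specific to this thesis):
% within-class scatter S_w, between-class scatter S_b, projection W.
S_w = \sum_{i=1}^{c} \sum_{x \in X_i} (x - \mu_i)(x - \mu_i)^{\top}, \qquad
S_b = \sum_{i=1}^{c} n_i\,(\mu_i - \mu)(\mu_i - \mu)^{\top},
\qquad
W^{\star} = \arg\max_{W} \operatorname{tr}\!\left( (W^{\top} S_w W)^{-1} W^{\top} S_b W \right).
```

In the small sample size setting that motivates the improved algorithm, S_w is singular, and the null-space variant (NLDA) restricts W to the null space of S_w and maximizes between-class scatter there. Below is a minimal NLDA sketch in Python with NumPy to illustrate that idea; it is a generic implementation, not the thesis's LMNLDA, and the function name nlda_fit and the toy data are illustrative assumptions only.

```python
import numpy as np

def nlda_fit(X, y, n_components=None):
    """Minimal null-space LDA (NLDA) sketch: restrict to null(S_w),
    then maximize between-class scatter there. Generic NLDA, not LMNLDA."""
    classes = np.unique(y)
    n_samples, n_features = X.shape
    mu = X.mean(axis=0)
    Sw = np.zeros((n_features, n_features))
    Sb = np.zeros((n_features, n_features))
    for c in classes:
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        Sw += (Xc - mu_c).T @ (Xc - mu_c)
        Sb += len(Xc) * np.outer(mu_c - mu, mu_c - mu)

    # Small sample case: rank(S_w) <= n_samples - c < n_features,
    # so S_w has a nontrivial null space.
    evals, evecs = np.linalg.eigh(Sw)
    null_basis = evecs[:, evals < 1e-10 * evals.max()]

    # Maximize between-class scatter inside null(S_w).
    Sb_null = null_basis.T @ Sb @ null_basis
    b_evals, b_evecs = np.linalg.eigh(Sb_null)       # eigenvalues in ascending order
    k = n_components if n_components is not None else len(classes) - 1
    W = null_basis @ b_evecs[:, ::-1][:, :k]         # top-k directions
    return W

# Toy usage: 30 samples in 100 dimensions, 3 classes (n_samples << n_features).
rng = np.random.default_rng(0)
centers = rng.normal(scale=3.0, size=(3, 100))
y = np.repeat([0, 1, 2], 10)
X = centers[y] + rng.normal(size=(30, 100))
W = nlda_fit(X, y)
print((X @ W).shape)   # (30, 2) low-dimensional projection
```

Projecting onto the null space of S_w sidesteps the singularity of the within-class scatter matrix when the number of samples is smaller than the feature dimension, which is precisely the small sample size problem the improved algorithm addresses.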
Keywords/Search Tags:curse of dimensionality, face recognition, pattern recognition, feature extraction, linear discriminant analysis, equivalence