
A Study Of Fisher Discriminant Analysis Based On L1 Norm

Posted on: 2018-03-09
Degree: Master
Type: Thesis
Country: China
Candidate: J L Yu
GTID: 2348330533965254
Subject: Applied Mathematics

Abstract/Summary:
Pattern recognition aims to use a computer to simulate or implement human learning and identification based on the characteristics or attributes of the object under study. Feature extraction plays an important role in pattern recognition and can effectively alleviate the "curse of dimensionality" that often arises in this field. Fisher linear discriminant analysis (FLDA), a classical feature extraction method, has been widely applied in areas such as biometrics, image retrieval, and text classification. By maximizing the Fisher criterion, FLDA seeks a subspace with better separability, that is, one in which the inter-class scatter of the data is largest and the intra-class scatter is smallest. However, the distance in the Fisher criterion is measured with the L2 norm, which is sensitive to outliers and lacks robustness. In recent years, discriminant analysis based on robust distances such as the L1 norm has attracted researchers' attention. Fisher linear discriminant analysis based on the L1 norm (FLDAL1) not only improves the robustness of FLDA but also avoids its rank limitation. However, solving FLDAL1 is challenging because of its nonconvexity. Based on the framework of the concave-convex procedure (CCCP), on the one hand we propose a new algorithm for FLDAL1 and verify with several examples that FLDA with the L1 norm is more robust than with the L2 norm in feature extraction. On the other hand, we extend FLDAL1 to the nonlinear kernel case and thus obtain a new model and algorithm for kernel Fisher discriminant analysis based on the L1 norm (KFDAL1). Experiments on visualization and classification accuracy show that, after feature extraction, KFDAL1 outperforms KFDA and FLDA on several artificial datasets, UCI datasets, and image datasets. Overall, it is better for Fisher discriminant analysis to use the L1 norm instead of the L2 norm, which overcomes the rank limitation and improves robustness in feature extraction.
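For reference, the classical Fisher criterion and one common L1-norm variant from the literature can be written as follows; this is a standard formulation and may differ in detail from the exact objective used in the thesis:

$$ J(w) = \frac{w^{\top} S_b\, w}{w^{\top} S_w\, w}, \qquad
S_b = \sum_{i=1}^{c} n_i\,(m_i - m)(m_i - m)^{\top}, \qquad
S_w = \sum_{i=1}^{c} \sum_{x \in X_i} (x - m_i)(x - m_i)^{\top}, $$

$$ J_{L1}(w) = \frac{\sum_{i=1}^{c} n_i \left| w^{\top}(m_i - m) \right|}
{\sum_{i=1}^{c} \sum_{x \in X_i} \left| w^{\top}(x - m_i) \right|}, $$

where $m_i$ and $n_i$ are the mean and size of class $X_i$ and $m$ is the overall mean. The L1 variant replaces squared projected distances with absolute values, which reduces the influence of outliers and no longer requires inverting the within-class scatter matrix.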
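The abstract states that FLDAL1 is solved within the concave-convex procedure (CCCP) framework. As a minimal, self-contained illustration of CCCP itself (not the thesis's FLDAL1 algorithm), the sketch below applies CCCP to a toy one-dimensional nonconvex function written as a difference of convex functions; the function, variable names, and iteration count are illustrative assumptions.

```python
import numpy as np

def cccp_minimize(x0, n_iter=50):
    """Toy CCCP run on f(x) = x**4 - 2*x**2.

    Write f as a difference of convex functions u(x) = x**4 and
    v(x) = 2*x**2.  Each CCCP step linearizes the concave part -v at the
    current iterate and solves the convex subproblem
        x_{t+1} = argmin_x  u(x) - v'(x_t) * x,
    which here has the closed-form solution x_{t+1} = cbrt(x_t)
    (obtained from the stationarity condition u'(x) = v'(x_t)).
    """
    x = float(x0)
    for _ in range(n_iter):
        grad_v = 4.0 * x            # v'(x_t) = 4 x_t
        x = np.cbrt(grad_v / 4.0)   # solve 4 x^3 = grad_v, i.e. x = cbrt(x_t)
    return x

if __name__ == "__main__":
    print(cccp_minimize(0.2))   # approaches +1, a local minimizer of f
    print(cccp_minimize(-3.0))  # approaches -1
```

Because the convex function v lies above its tangent, each subproblem majorizes the original objective at the current iterate, so the objective value never increases from one CCCP step to the next; this descent property is what makes CCCP attractive for nonconvex objectives such as FLDAL1.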
Keywords/Search Tags:curse of dimensionality, feature extraction, Fisher linear discriminant analysis, L1 norm, robustness, concave-convex procedure, kernel method