
ICA feature extraction and support vector machine image classification

Posted on: 2006-01-20
Degree: Ph.D.
Type: Thesis
University: McMaster University (Canada)
Candidate: Fortuna, Jeffrey
Full Text: PDF
GTID: 2458390005993675
Subject: Engineering
Abstract/Summary:
This thesis presents a detailed examination of the use of Independent Component Analysis (ICA) for feature extraction and a support vector machine (SVM) for image recognition applications. The performance of ICA as a feature extractor is compared against the benchmark of Principal Component Analysis (PCA). Given the intrinsic relationship between PCA and ICA, the theoretical implications of this relationship in the context of feature extraction are investigated in detail. The thesis outlines specific theoretical issues that motivate the need for a feature selection scheme when ICA is used with Euclidean distance classification. Experimental verification of the behavior of ICA with Euclidean distance classifiers is provided by pose and position measurement experiments under conditions of lighting variation and occlusion. It is shown that, provided the features are selected appropriately, ICA-derived features are more discriminating than PCA-derived features. ICA's utility in object recognition under varying illumination is demonstrated with databases of specular objects and faces.

A new application for ICA is illustrated by using ICA-derived filters for face recognition with a multi-class SVM classifier. The ICA filters function similarly to Laplacian of Gaussian (LoG) filters by providing a degree of lighting-invariant recognition, but they are tuned to the specific spatial-frequency and orientation characteristics of the face dataset. The application shows that classifier performance is sensitive to the tuning of the filters; consequently, filters derived from the data by ICA provide performance comparable to LoG filters without the need for manual tuning.

Conceived as a method to further improve the classification of PCA- and ICA-derived features, a novel algorithm is presented for improving support vector machine performance by modifying such features derived from an image database. Specifically, the modification is performed iteratively by adjusting the positions of the support vectors in the linear feature space that are hypothesized to be outliers. Convergence is shown to occur when very few support vectors remain to be modified. A new basis for the database is then computed by linear regression on the modified features. In this way, the SVM is used both to classify the dataset and to derive a set of features that result in compact classes providing maximum margin, yielding a simple and effective way of unifying feature extraction and classification. The performance of the compact-class SVM is demonstrated with a series of Gaussian mixture, object, and face databases. It is shown that the compact classes resulting from the algorithm provide a significant improvement in the generalization ability of the SVM by dramatically increasing the margin and decreasing the number of support vectors. For image classification, the technique is particularly effective, in some cases achieving the maximum attainable margin, illustrating that image datasets can be well described by compact classes.
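The abstract does not specify the implementations used, so the following is a minimal sketch of the kind of PCA-versus-ICA comparison described above, assuming images are supplied as flattened row vectors with integer class labels and using scikit-learn's PCA, FastICA, KNeighborsClassifier (Euclidean 1-NN) and SVC as stand-ins for the thesis's feature extractors and classifiers. The function name, component count, and train/test split are illustrative choices, not taken from the thesis.

from sklearn.decomposition import FastICA, PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def compare_feature_extractors(X, y, n_components=40, random_state=0):
    """Compare PCA- and ICA-derived features with a Euclidean 1-NN
    classifier and a linear multi-class SVM; returns test accuracies."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=random_state)

    extractors = {
        "PCA": lambda: PCA(n_components=n_components, whiten=True,
                           random_state=random_state),
        "ICA": lambda: FastICA(n_components=n_components, max_iter=1000,
                               random_state=random_state),
    }
    classifiers = {
        "Euclidean 1-NN": lambda: KNeighborsClassifier(n_neighbors=1),
        "linear SVM": lambda: SVC(kernel="linear", C=1.0),
    }

    scores = {}
    for ext_name, make_ext in extractors.items():
        for clf_name, make_clf in classifiers.items():
            pipeline = make_pipeline(make_ext(), make_clf())
            pipeline.fit(X_tr, y_tr)
            scores[(ext_name, clf_name)] = pipeline.score(X_te, y_te)
    return scores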
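The compact-class algorithm is described above only at a high level; the sketch below is one hedged reading of it, assuming (this is not stated in the abstract) that each support vector is pulled a small step toward its class centroid in the linear feature space, that iteration stops once only a handful of support vectors remain, and that the new basis is obtained by ordinary least squares from the raw images to the modified features. Names such as compact_classes, step and sv_tolerance are hypothetical, not the thesis's notation.

import numpy as np
from sklearn.svm import SVC

def compact_classes(F, y, step=0.2, max_iter=50, sv_tolerance=2):
    """Iteratively pull support vectors toward their class centroids.

    F : (n_samples, n_features) array of linear features (e.g. PCA or ICA)
    y : (n_samples,) array of class labels
    Returns the modified features and the final linear SVM.
    """
    F = np.asarray(F, dtype=float).copy()
    y = np.asarray(y)
    svm = SVC(kernel="linear", C=1.0).fit(F, y)
    for _ in range(max_iter):
        sv_idx = svm.support_            # indices of current support vectors
        if len(sv_idx) <= sv_tolerance:  # very few left to modify: converged
            break
        for i in sv_idx:
            centroid = F[y == y[i]].mean(axis=0)
            F[i] += step * (centroid - F[i])  # move toward own class centroid
        svm = SVC(kernel="linear", C=1.0).fit(F, y)  # refit on modified features
    return F, svm

def refit_basis(X, F_modified):
    """Solve X @ B ~= F_modified by least squares, giving a new linear basis
    that maps raw (flattened) images directly to the compact-class features."""
    B, *_ = np.linalg.lstsq(X, F_modified, rcond=None)
    return B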
Keywords/Search Tags:ICA, Feature extraction, Support vector machine, Image, Compact classes, Classification, SVM, PCA