Dimensionality reduction is one of the most important preprocessing steps in high-dimensional data analysis and pattern recognition. In today's information age, large amounts of high-dimensional data are encountered in many research areas, such as face detection and recognition, text categorization, and gene selection from microarray data. In practical applications, to avoid the so-called curse of dimensionality, high-dimensional data are usually transformed into a low-dimensional representation that preserves some desired property; this transformation is precisely the process of dimensionality reduction. In short, dimensionality reduction aims to embed high-dimensional data samples into a relatively low-dimensional space while preserving as much of the data's intrinsic information as possible. Dimensionality reduction techniques fall into two categories: feature extraction and feature selection. After an appropriate dimensionality reduction step, subsequent tasks such as visualization and classification can be carried out conveniently in the low-dimensional space.

This paper explores novel supervised dimensionality reduction algorithms and proposes both an SVM-based feature selection method and two feature reduction methods based on a supervised locality preserving criterion. The principle of dimensionality reduction is briefly introduced and several existing techniques are reviewed, including Principal Component Analysis, Fisher Linear Discriminant Analysis, recent manifold-based feature extraction methods, and related feature selection methods. The main contributions are as follows. First, an improved SVM-based feature selection method is proposed, and the principle behind it and the way it is used for feature selection are elaborated. Second, two dimensionality reduction methods based on a supervised locality preserving criterion are proposed, and likewise their principle and their use for feature reduction are elaborated. The SVM-based feature selection algorithm fully exploits the maximum-margin property, ranking and selecting features by means of the support vectors and the kernel function. The paper also analyzes the advantages and disadvantages of the unsupervised locality preserving criterion in detail, and then proposes a supervised locality preserving criterion that not only preserves the intra-class local structure but also maximizes class separability; this criterion is applied to both feature extraction and feature selection. Finally, all the algorithms are implemented in MATLAB, and experimental results on a number of datasets demonstrate the effectiveness and feasibility of the proposed methods as well as their advantages over several existing dimensionality reduction methods.
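To make the margin-based ranking idea concrete, the following is a minimal MATLAB sketch of one plausible single-pass variant: a linear SVM is trained, and features are scored by the squared entries of the weight vector assembled from the support vectors. The example dataset, the single-pass scoring, and all variable names are illustrative assumptions and do not reproduce the exact algorithm proposed in this paper (fitcsvm requires the Statistics and Machine Learning Toolbox).

% Illustrative sketch only: score features with a linear maximum-margin SVM.
load fisheriris                                    % example data shipped with MATLAB
X = meas(1:100, :);                                % two classes: setosa vs. versicolor
y = species(1:100);
mdl = fitcsvm(X, y, 'KernelFunction', 'linear');   % train the max-margin classifier
w = mdl.Beta;                                      % weight vector built from the support vectors
[~, featOrder] = sort(w.^2, 'descend');            % rank features by squared weight
topFeatures = featOrder(1:2);                      % keep the two highest-scoring features
disp(topFeatures)

A kernelized variant would replace the explicit weight vector with a margin-sensitivity score computed from the support vectors and the kernel function, which is closer in spirit to the method summarized above.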
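The supervised locality preserving criterion can similarly be sketched as a graph-based linear projection whose affinity matrix connects only samples of the same class, so that intra-class local structure is preserved while class separability is encouraged. The heat-kernel weights, the small regularization term, and the function name below are assumptions made for illustration and are not necessarily the exact formulation proposed in this paper.

function W = supervisedLPP(X, y, d)
% X: n-by-p data matrix, y: n-by-1 class labels, d: target dimensionality.
n = size(X, 1);
S = zeros(n);                                % affinity matrix over the samples
for i = 1:n
    for j = 1:n
        if i ~= j && y(i) == y(j)
            S(i, j) = exp(-norm(X(i, :) - X(j, :))^2);   % heat-kernel weight, same class only
        end
    end
end
D = diag(sum(S, 2));                         % degree matrix
L = D - S;                                   % graph Laplacian of the within-class graph
A = X' * L * X;
B = X' * D * X + 1e-6 * eye(size(X, 2));     % small ridge keeps B well conditioned
[V, E] = eig(A, B);                          % generalized eigenproblem
[~, idx] = sort(real(diag(E)), 'ascend');    % smallest eigenvalues preserve locality best
W = V(:, idx(1:d));                          % project new data as X * W
end

Using the labels only to build the graph is one simple way to combine locality preservation with class information; the criterion described above additionally maximizes class separability, which would correspond to adding a between-class term to this objective.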