Feature extraction is one of the fundamental problems in pattern recognition and the key to classification tasks such as face identification and handwritten character recognition. Kernel projection analysis, including kernel principal component analysis (KPCA) and kernel Fisher discriminant analysis (KFDA), is an efficient nonlinear feature extraction method recently proposed by Schölkopf, Mika, et al. This paper not only extends kernel projection analysis theoretically, but also analyzes the associated algorithms; the proposed algorithms can be successfully applied to face recognition and handwritten character recognition.

Foley-Sammon linear discriminant analysis (FSDA) is an efficient linear feature extraction method. Based on FSDA and kernel Fisher discriminant analysis, a kernel Foley-Sammon discriminant analysis (KFSDA) is proposed. First, two equivalent models of KFSDA are built and the relationship between them is analyzed; then the detailed implementation of the KFSDA models is given, together with the corresponding proofs. KFSDA preserves the advantage of FSDA that redundant information among sample features is reduced very well, and it can moreover effectively extract the nonlinear features of the samples. KFSDA is thus a further extension of FSDA. Experiments on the CENPARMI database of handwritten Arabic numerals from Concordia University demonstrate the effectiveness of KFSDA.

Based on kernel methods, this paper also extends the generalized optimal set of discriminant vectors nonlinearly and proposes the new concept of a generalized optimal set of kernel discriminant vectors (GOSKDV). The corresponding model is built and a detailed implementation is given. The analysis shows that the features extracted with the GOSKDV have maximum overall separability as well as nonlinear characteristics.
The GOSKDV is thus a further extension of the generalized optimal set of discriminant vectors. Experimental results on the ORL face database show that the proposed method is valid.

Although kernel Fisher discriminant analysis (KFDA) has become one of the most efficient nonlinear feature extraction methods, it often faces the singularity problem, and the existing KFDA algorithms have not solved this problem very well. This paper proposes an optimal kernel Fisher discriminant analysis (OKFDA), which solves the computation of the optimal kernel discriminant vectors in the singular case. OKFDA divides the optimal kernel discriminant vectors into two kinds. First, in the null space of the kernel within-class scatter matrix, an orthonormal vector set that maximizes the kernel between-class scatter is selected, giving the first kind of optimal kernel discriminant vectors. Then, outside that null space, a set of unit vectors that maximizes the kernel discriminant criterion is selected, giving the second kind. Together, the two kinds form the optimal kernel discriminant vector set, which extracts the optimal nonlinear discriminant features of the original samples. Experimental results on a subset of the FERET face database show the effectiveness of OKFDA.

Although independent component analysis (ICA) plays an important role in face recognition owing to its good properties, feature extraction from face images, even with the fast ICA algorithm (FastICA), suffers from heavy computation and long running times. A new automatic face recognition method is therefore proposed in this paper.
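The first stage of OKFDA described above — selecting orthonormal directions in the null space of the within-class scatter that maximize the between-class scatter — can be sketched in its linear (input-space) analogue. This is a minimal NumPy illustration of the idea only, not the paper's kernelized algorithm; the function name and tolerance are illustrative assumptions.

```python
import numpy as np

def null_space_discriminant_vectors(X, y, tol=1e-8):
    """Linear analogue of OKFDA's first stage: orthonormal directions
    lying in the null space of the within-class scatter S_w, ordered so
    that they maximize the between-class scatter S_b."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                    # within-class scatter
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)   # between-class scatter
    # Basis of the null space of S_w (eigenvectors with ~zero eigenvalue).
    w_vals, w_vecs = np.linalg.eigh(Sw)
    P = w_vecs[:, w_vals < tol]
    if P.shape[1] == 0:
        return P                                         # S_w nonsingular: empty
    # Inside null(S_w), pick directions that maximize the between-class scatter.
    b_vals, b_vecs = np.linalg.eigh(P.T @ Sb @ P)
    order = np.argsort(b_vals)[::-1]
    return P @ b_vecs[:, order]                          # columns are orthonormal
```

In the small-sample (singular) case, where the sample dimension exceeds the number of samples, this null space is nonempty and the selected directions achieve zero within-class scatter while keeping the classes apart, which is exactly why OKFDA treats them as the first kind of discriminant vectors.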
First, kernel principal component analysis (KPCA) is used to reduce the dimension of the original face image, so that the principal component features of the face image are emphasized and the higher-order statistical information about the nonlinear relationships among the pixels is taken into account. Finally, the FastICA algorithm is applied to extract the principal independent components.
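The KPCA dimension-reduction step used above can be sketched as follows. This is a minimal NumPy sketch of standard KPCA with an RBF kernel, assuming the samples are rows of a matrix; the kernel choice, `gamma`, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kpca_fit_transform(X, n_components=2, gamma=0.1):
    """Project the training samples onto their leading kernel principal
    components (Scholkopf-style KPCA)."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Centre the kernel matrix in feature space: Kc = K - 1K - K1 + 1K1.
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigh returns eigenvalues in ascending order; take the largest ones.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    # Normalize expansion coefficients so each feature-space PC has unit norm.
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas  # projections of the samples on the kernel PCs
```

The reduced representation returned here is what would then be fed to FastICA in the pipeline described above, cutting its input dimension and hence its cost.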