Pattern recognition based on statistical theory is an important field of study in artificial intelligence. Pattern recognition has been studied in depth, and some of the relevant techniques have been successfully applied in many fields. However, pattern recognition still confronts many challenges, and many issues need to be explored and studied further. Feature dimension reduction and kernel methods are two important topics in this field. Motivated by the above challenges, several issues are addressed in this study, which consists of the following four parts.

In the first part, which comprises Chapter 2, a new algorithm called generalized supervised locality preserving projection (GSLPP) is proposed to address a drawback of supervised locality preserving projection (SLPP): the so-called "small sample size" problem in the high-dimensional, small-sample-size case. The relationship between SLPP and GSLPP is analyzed theoretically. Moreover, in the small sample size case GSLPP can be solved equivalently in a lower-dimensional space.

In the second part, which comprises Chapters 3 and 4, we discuss how to improve the performance of the minimum class variance support vector machines (MCVSVM) algorithm. In contrast to traditional support vector machines (SVM), MCVSVM effectively utilizes the distribution of the classes, but it does not take the underlying geometric structure fully into consideration. Therefore, in Chapter 3, the so-called minimum class locality preserving variance support vector machines (MCLPVSVM) is presented by introducing the basic theory of locality preserving projections (LPP) into MCVSVM. This method inherits the characteristics of traditional SVM and MCVSVM, fully considers the geometric structure of the samples, and shows better learning performance. On the other hand, in the small sample size case MCVSVM utilizes only the information in the non-null space of the within-class scatter matrix.
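The small-sample-size situation mentioned above can be illustrated numerically: when the dimensionality d exceeds the number of samples n, the within-class scatter matrix is necessarily rank-deficient and therefore has a non-trivial null space. The following is a minimal sketch (not code from this thesis; the data and function names are illustrative):

```python
import numpy as np

def within_class_scatter(X, y):
    """S_w = sum over classes c of (X_c - mean_c)^T (X_c - mean_c).

    X: (n, d) sample matrix, y: (n,) class labels.
    """
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        diff = Xc - Xc.mean(axis=0)   # center each class
        Sw += diff.T @ diff
    return Sw

rng = np.random.default_rng(0)
n, d = 10, 50                         # n << d: small sample size case
X = rng.standard_normal((n, d))
y = np.array([0] * 5 + [1] * 5)

Sw = within_class_scatter(X, y)
rank = np.linalg.matrix_rank(Sw)
# Each class of size n_c contributes rank at most n_c - 1, so
# rank(S_w) <= n - (number of classes) = 8, far below d = 50.
print(rank, d)
```

The directions in the null space of S_w are exactly the information that MCVSVM discards and that the null space classifier of Chapter 4 is designed to exploit.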
To further improve classification performance, Chapter 4 first presents the null space classifier (NSC), which is rooted in the null space, and then proposes a novel ensemble classifier (EC) obtained by assembling MCVSVM and the NSC. Unlike MCVSVM and the NSC, the EC takes into consideration the information in both the non-null space and the null space.

In the third part, which comprises Chapter 5, based on the idea that support vector regression (SVR) can be regarded as a classification problem in the dual space, MCVSVM is extended to handle regression tasks, and a novel regression algorithm called minimum variance support vector regression is proposed. This method inherits the characteristics of MCVSVM, such as giving a more robust solution and achieving better generalization performance, and it can be transformed into traditional SVR.

In the fourth part, which comprises Chapter 6, the properties of support vector data description (SVDD) solutions are explored. Most previous research on SVDD, one of the most effective and widely applied kernel methods, has been directed toward efficient implementations and practical applications; very few attempts have been made to study the properties of SVDD solutions. In Chapter 6, the primal optimization problem of SVDD is first transformed into a convex constrained optimization problem; the uniqueness of the centre of the ball is then proved, and the non-uniqueness of the radius is investigated. We also investigate the properties of the centre and radius from the perspective of the dual optimization problem and suggest a method for calculating the radius.

As a whole, this study addresses feature dimension reduction in the first part and kernel methods in the second through fourth parts.
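The centre–radius structure of SVDD discussed in the fourth part can be sketched concretely. The following minimal example (an illustration under stated assumptions, not the thesis implementation) solves the standard SVDD dual for a linear kernel with scipy's generic SLSQP solver; the centre is the convex combination a = Σ_i α_i x_i, and the radius is read off from the unbounded support vectors (0 < α_i < C):

```python
import numpy as np
from scipy.optimize import minimize

def svdd_fit(X, C=1.0):
    """SVDD with a linear kernel.

    Dual: minimize a^T K a - a^T diag(K)  s.t.  sum(a) = 1, 0 <= a_i <= C.
    """
    n = X.shape[0]
    K = X @ X.T                                   # linear kernel matrix
    fun = lambda a: a @ K @ a - a @ np.diag(K)
    jac = lambda a: 2 * K @ a - np.diag(K)
    cons = {"type": "eq", "fun": lambda a: a.sum() - 1.0}
    res = minimize(fun, np.full(n, 1.0 / n), jac=jac,
                   bounds=[(0.0, C)] * n, constraints=[cons], method="SLSQP")
    alpha = res.x
    centre = alpha @ X                            # unique centre of the ball
    sv = (alpha > 1e-6) & (alpha < C - 1e-6)      # unbounded support vectors
    if sv.any():
        # KKT: all unbounded support vectors lie on the sphere surface.
        radius = np.linalg.norm(X[sv] - centre, axis=1).mean()
    else:
        # Degenerate fallback when every support vector is at the bound C.
        radius = np.linalg.norm(X[alpha > 1e-6] - centre, axis=1).min()
    return centre, radius

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 2))
centre, radius = svdd_fit(X, C=0.1)               # C < 1 allows outliers
print(centre, radius)
```

This matches the properties established in Chapter 6: the centre is unique (the dual objective is a convex quadratic over a compact convex set), whereas the radius depends on which surface point is used to read it off, hence the interest in a principled method for calculating it.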