
Research On Some Problems Of Kernel Method In Pattern Recognition

Posted on: 2007-05-08
Degree: Doctor
Type: Dissertation
Country: China
Candidate: H P Wan
Full Text: PDF
GTID: 1118360185467788
Subject: Signal and Information Processing

Abstract/Summary:
The kernel method is a powerful machine learning method developed in recent years, built on statistical learning theory. Statistics has long played an important role in machine learning and pattern recognition, but most results of classical statistics rest on asymptotic assumptions, i.e., statistical properties that emerge only as the number of observed samples tends to infinity. This constraint is so strict that it is hard to meet in practice; despite this apparent drawback, many machine learning algorithms still adopt the presupposition.

By contrast, statistical learning theory studies the machine learning problem in the case of finite sample observations. After several decades of development, it has acquired a sound theoretical foundation. It regards machine learning as the general problem of estimating a function from finite sample observations, and it studies when the principle of empirical risk minimization holds, the relationship between empirical risk and expected risk, and how to seek new algorithms within this framework.

The representative algorithm of statistical learning theory is the kernel method, or so-called support vector machine, which can be applied to many pattern analysis problems such as pattern classification, clustering, regression analysis and novelty detection. In this paper I discuss kernel methods in pattern classification and clustering, along with several other key problems in kernel theory and its applications.

The classical kernel method is mainly defined on vectors in Euclidean space. A method is proposed that generalizes the kernel method to sets of vectors from the viewpoint of space rotation, so that more complicated data object types can be represented and processed. Compared with similar schemes, it neither needs to impose a hypothetical probability density function on the data in advance, nor does it involve complicated numerical integration in order to apply the kernel method.
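The dissertation's set-of-vectors kernel is not specified in this abstract. As background for the classical setting it generalizes, the following is a minimal NumPy sketch of the standard Gaussian (RBF) kernel on Euclidean vectors; the `gamma` value is an illustrative assumption, not a parameter from the thesis:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2).

    This is the classical kernel on Euclidean vectors, not the
    set-of-vectors generalization proposed in the dissertation.
    """
    # Squared Euclidean distances via the expansion ||x||^2 + ||y||^2 - 2 x.y
    sq = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Y ** 2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))  # clamp tiny negatives

# Three sample points in the plane (illustrative data).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
K = rbf_kernel(X, X, gamma=0.5)
```

A valid kernel matrix is symmetric and positive semi-definite, with ones on the diagonal for the RBF kernel; any algorithm written purely in terms of inner products (the "kernel trick") can then operate on `K` instead of the raw vectors.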
In a face recognition benchmark experiment, it achieves positive results and outperforms similar work.

Eigenface and Fisherface are two popular methods in face recognition. Both are linear methods that aim to explore the global structure of image space; learning nonlinear structure is beyond their scope. The significance of local structure in face recognition is discussed. On the premise that local structure can be learned and represented by linear or approximately linear patterns, global nonlinear structure can be learned and represented through piecewise linearization, and a method that learns global nonlinear structure in combination with the LLE algorithm is proposed. In a face recognition benchmark experiment, it exceeds both the Eigenface and Fisherface algorithms.

LPP is a data visualization and dimensionality reduction algorithm based on spectral graph theory. Because the spectral graph is tightly connected with differential manifolds, and geodesic distance is more accurate than Euclidean distance in disclosing similarities between...
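The piecewise-linearization premise can be illustrated with the local step of standard LLE: each point is reconstructed as an affine combination of its nearest neighbors, which is exact whenever the neighborhood is linear. This is a generic sketch of the textbook LLE weight computation, not the author's combined algorithm; the regularization constant is an assumption:

```python
import numpy as np

def lle_weights(X, i, neighbor_idx, reg=1e-3):
    """Reconstruction weights expressing point i as an affine combination
    of its neighbors (the local-linearity step of standard LLE)."""
    Z = X[neighbor_idx] - X[i]                 # neighbors shifted to point i
    C = Z @ Z.T                                # local Gram (covariance) matrix
    C = C + reg * np.trace(C) * np.eye(len(neighbor_idx))  # regularize
    w = np.linalg.solve(C, np.ones(len(neighbor_idx)))
    return w / w.sum()                         # weights sum to one

# A point on a straight line is reconstructed exactly from its two neighbors.
X = np.array([[0.0], [1.0], [2.0]])
w = lle_weights(X, 1, [0, 2])
x_rec = w @ X[[0, 2]]                          # reconstruction of X[1]
```

Because the middle point lies exactly between its neighbors, the weights come out as (0.5, 0.5) and the reconstruction is exact; on a curved manifold the same weights capture the local linear structure that the global nonlinear method then stitches together.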
Keywords/Search Tags:Pattern recognition, statistical learning theory, classification, clustering, spectral graph, local neighborhood relationship, subspace, secure multi-party computation, oblivious transfer