As a well-developed machine learning algorithm, the Support Vector Machine (SVM) has been widely used in many fields, but the selection of kernel parameters has always been the core problem affecting its performance. Kernel learning criteria that are independent of SVM have been extensively favored in recent years because of their low computational complexity and good performance. In this paper, we introduce two typical kernel learning criteria, the Centered Kernel Target Alignment (CKTA) and the Cosine Similarity of Kernel (CSK), and then improve each of them. CKTA treats normal sample points and outliers equally, which may impair the separability of samples in the feature space. We therefore reconstruct the ideal kernel matrix from the proportion of same-class samples in the nearest-neighbor matrix of the training set, obtaining a new kernel learning method named New Local Centered Kernel Target Alignment (NLCKTA); the corresponding multiple kernel learning method is derived by maximizing the NLCKTA value. To address the fact that the value of CSK is not theoretically guaranteed to be non-negative and is difficult to extend to multiple kernel learning, we obtain a new kernel learning method called New Cosine Similarity of Kernel (NCSK) by splitting the samples' average within-class cosine similarity into positive and negative parts; the corresponding multiple kernel learning method weights each basic kernel function by the relative size of its NCSK value. Experimental results show that NLCKTA significantly improves on CKTA in both single and multiple kernel learning, while NCSK retains the single kernel learning ability of CSK and generally outperforms CKTA in multiple kernel learning.
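For reference, the baseline CKTA criterion that the abstract builds on can be sketched as below: the alignment between a centered kernel matrix and a centered ideal kernel built from the labels. The RBF kernel, its bandwidth, and the toy two-cluster data are illustrative assumptions for the sketch, not the paper's NLCKTA method.

```python
import numpy as np

def center(K):
    """Center a kernel matrix: Kc = H K H with H = I - (1/n) 1 1^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def ckta(K, y):
    """Centered Kernel Target Alignment between kernel K and labels y.

    The ideal kernel is Y = y y^T for labels in {-1, +1}; the criterion is
    <Kc, Yc>_F / (||Kc||_F * ||Yc||_F), which lies in [-1, 1].
    """
    Y = np.outer(y, y).astype(float)
    Kc, Yc = center(K), center(Y)
    return np.sum(Kc * Yc) / (np.linalg.norm(Kc) * np.linalg.norm(Yc))

# Toy example (illustrative): two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (10, 2)), rng.normal(2, 0.5, (10, 2))])
y = np.array([-1] * 10 + [1] * 10)

# RBF kernel with an assumed bandwidth; kernel parameter selection is
# exactly what alignment-style criteria are used to guide.
sq_dists = np.sum((X[:, None] - X[None]) ** 2, axis=-1)
K = np.exp(-sq_dists / 2.0)
print(ckta(K, y))  # high alignment for well-separated classes
```

In practice, one evaluates such a criterion over a grid of candidate kernel parameters and keeps the kernel with the largest alignment, avoiding the cost of training an SVM for each candidate.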