Since F. Rosenblatt introduced the perceptron model, machine learning has advanced greatly alongside developments in computer technology. After several setbacks, statistical learning theory was introduced into the field in the 1990s, giving rise to a new method: the support vector machine (SVM), which is based on structural risk minimization. Research on SVMs follows two main branches: support vector classification (SVC) and support vector regression (SVR). Parameter selection is an important research direction for SVMs, and the choice of kernel function is especially critical. Most kernel functions in current use are positive definite; once a non-positive kernel is introduced, the standard SVM model no longer works, because the SVR problem can no longer be transformed into a convex quadratic programming problem.

Research on SVC is well developed and offers many methods for parameter selection; by contrast, SVR has received far less attention. The geometric framework of SVR reveals the relationship between SVR and SVC: within this framework, an SVR problem can be viewed as an SVC problem, which allows SVR to be studied with the methods of SVC. This translation, however, may produce a non-positive kernel function.

This article addresses the two problems above. For the first, it presents a model, called here the similar SVR model, built on the SVR model and on learning theory in reproducing kernel Kreĭn spaces (RKKS); in experiments, this model performs well with both positive and non-positive kernel functions. For the second, two approaches are given: first, a mapping transformation that keeps the kernel function positive when it is carried over from SVC to SVR; second, handling the non-positive SVR problem directly with the similar SVR model. Experiments show that both approaches work well, and further confirm that the similar SVR model solves SVR problems with non-positive kernel functions effectively.
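As an illustration of why a non-positive kernel breaks the standard convex formulation (a minimal sketch, not from this article), the Gram matrix of an indefinite kernel can have negative eigenvalues, so the quadratic objective built from it is no longer convex. The numpy snippet below contrasts the RBF kernel, whose Gram matrix is always positive semi-definite, with the sigmoid kernel under an assumed parameter choice (`coef0 = -1`) for which it is provably indefinite:

```python
import numpy as np

# Toy 1-D sample points; including 0 gives the sigmoid Gram matrix
# a negative diagonal entry (tanh(-1) < 0), so it cannot be PSD.
X = np.array([0.0, 0.5, 1.0, 2.0])

def rbf_gram(X, gamma=1.0):
    # RBF kernel k(x, y) = exp(-gamma * (x - y)^2): always positive semi-definite.
    D = (X[:, None] - X[None, :]) ** 2
    return np.exp(-gamma * D)

def sigmoid_gram(X, gamma=1.0, coef0=-1.0):
    # Sigmoid kernel k(x, y) = tanh(gamma * x * y + coef0):
    # not positive semi-definite for many parameter choices.
    return np.tanh(gamma * X[:, None] * X[None, :] + coef0)

min_eig_rbf = np.linalg.eigvalsh(rbf_gram(X)).min()
min_eig_sig = np.linalg.eigvalsh(sigmoid_gram(X)).min()

print("RBF     min eigenvalue:", min_eig_rbf)  # >= 0 up to rounding error
print("sigmoid min eigenvalue:", min_eig_sig)  # negative: kernel is indefinite
```

A negative minimum eigenvalue means the dual SVR objective loses convexity, which is exactly the failure mode the article's similar SVR model and mapping transformation are designed to avoid.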