Research on support vector machines (SVMs) focuses mainly on two problems: the design of the objective function and the method used to solve it. On the design side, a new loss function is proposed because the traditional hinge loss is sensitive to noise: on noisy data sets, noisy samples distort the separating hyperplane and degrade classification performance. By assigning a very small loss to noisy samples that lie far from the separating hyperplane, classification accuracy can be improved. An offset parameter is also introduced, which allows the SVM to handle a wider variety of noisy data sets. In addition, for imbalanced data sets the SVM objective function is reweighted using entropy-based weights, and the entropy-weighted formulation is combined with the modified loss function during solving, in order to achieve better classification performance and a faster convergence rate. Finally, because stochastic gradient descent (SGD) scales well to large data sets, SGD is applied to solve the objective function of the SVM with the modified loss function.
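The combination described above can be sketched in code. The sketch below is illustrative only: the abstract does not give the exact form of the proposed loss or of the entropy weights, so it substitutes a ramp (truncated hinge) loss, whose gradient vanishes for points far on the wrong side of the hyperplane, and simple inverse-frequency class weights as a stand-in for entropy weighting. The function name `sgd_svm`, the cap parameter `cap`, and all hyperparameter values are assumptions, not the paper's method.

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, cap=2.0, epochs=20, lr0=0.1, seed=0):
    """SGD for a linear SVM with a truncated (ramp-style) hinge loss
    and per-class weights.  Illustrative sketch: the paper's exact
    loss function and entropy weighting are assumptions here."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    # Inverse-frequency class weights (stand-in for entropy weighting):
    # rarer classes receive a larger weight.
    classes, counts = np.unique(y, return_counts=True)
    weight = {c: n / (len(classes) * cnt) for c, cnt in zip(classes, counts)}
    for epoch in range(epochs):
        lr = lr0 / (1 + epoch)              # decaying step size
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w + b)
            # Ramp loss: acts like the hinge for margins in (1-cap, 1),
            # but goes flat (zero gradient) once the loss would exceed
            # `cap`, so far-misclassified points -- likely noise -- stop
            # pulling on the hyperplane.
            if 1 - cap < margin < 1:
                g = -weight[y[i]] * y[i]    # subgradient of the loss
                w -= lr * (lam * w + g * X[i])
                b -= lr * g
            else:
                w -= lr * lam * w           # regularizer term only
    return w, b
```

On an imbalanced two-class toy problem (e.g. 40 positives drawn around (2, 2) and 160 negatives around (-2, -2)), `w, b = sgd_svm(X, y)` recovers a separating hyperplane, and capping the loss keeps any mislabeled outliers from dominating the updates.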