Support vector machine (SVM) is a supervised learning model for classification and has been extensively applied to text classification, disease diagnosis, face detection, and other tasks. It is widely recognized that the 0/1 loss SVM is the most natural optimization model for the SVM problem, since it directly minimizes the number of misclassified samples. However, the 0/1 loss SVM is NP-hard because the 0/1 loss function is nonconvex and discontinuous, and until now its optimality conditions and algorithms had not been established. By analyzing the subdifferential and the proximal operator of the 0/1 loss function, this paper establishes optimality conditions and develops an effective algorithm for the primal model of the 0/1 loss SVM. To explore the relationship between the 0/1 loss SVM and approximate-loss SVMs, this paper also proposes a new nonconvex approximation, the truncated concave loss SVM, establishes its optimality conditions and algorithm, and gives conditions under which the two models are equivalent, together with a numerical comparison.

For the 0/1 loss SVM model, based on explicit expressions for the subdifferential and proximal operator of the 0/1 loss function, this paper introduces two types of optimality conditions: the KKT condition and the P-stationary condition. By analyzing the relationships between a local minimizer and these two kinds of points, it obtains first-order necessary and sufficient optimality conditions for the 0/1 loss SVM model. It then defines 0/1 support vectors via the P-stationary condition and shows that all 0/1 support vectors lie on the two support hyperplanes.

Based on the P-stationary point and the 0/1 support vectors, this paper develops the 0/1-ADMM algorithm for solving the 0/1 loss SVM and proves that it converges globally. The algorithm operates on working sets defined by the 0/1 support vectors, which keeps its computational complexity very low. The paper uses the
P-stationary point as a termination criterion, guaranteeing the local optimality of the point generated by the 0/1-ADMM algorithm. The algorithm is further applied to several application problems. Compared with several leading classification solvers, extensive numerical experiments demonstrate that the proposed method achieves better performance: higher prediction accuracy, fewer support vectors, and faster computation. The larger the data set, the more pronounced these advantages become.

For the truncated concave loss SVM model, using the proximal operator of the truncated concave loss function, this paper studies first-order necessary and sufficient optimality conditions and the support vectors of the model. Under certain conditions, the truncated concave loss SVM is shown to be equivalent to the 0/1 loss SVM. For comparison with the 0/1-ADMM algorithm, this paper develops the φ-ADMM algorithm for solving the truncated concave loss SVM and proves that it converges globally. Numerical results demonstrate the superior performance of the φ-ADMM algorithm in terms of prediction accuracy, number of support vectors, and computational speed.
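As a concrete illustration of why proximal methods apply despite the discontinuity (this sketch is not taken from the paper itself), the proximal operator of the 0/1 loss admits a simple closed form. Writing ℓ₀/₁(t) = 1 if t > 0 and 0 otherwise, the proximal point prox_{λℓ}(z) = argmin_t { λℓ₀/₁(t) + ½(t − z)² } equals 0 when 0 < z ≤ √(2λ) and z otherwise, since for z > 0 one compares keeping t = z (cost λ) with jumping to t = 0 (cost z²/2). A minimal Python sketch, with a brute-force check of the closed form:

```python
import math

def l01(t):
    """0/1 loss: counts a sample as misclassified when its margin slack t is positive."""
    return 1.0 if t > 0 else 0.0

def prox_l01(z, lam):
    """Closed-form proximal operator of lam * l01: argmin_t lam*l01(t) + 0.5*(t - z)**2."""
    if 0 < z <= math.sqrt(2 * lam):
        return 0.0
    return z

def prox_brute(z, lam):
    """Brute-force minimizer over a fine grid on [-5, 5], for verification only."""
    grid = [i / 1000 - 5 for i in range(10001)]
    return min(grid, key=lambda t: lam * l01(t) + 0.5 * (t - z) ** 2)

# the closed form matches the brute-force minimizer
for z in (-1.0, 0.5, 2.0):
    assert abs(prox_l01(z, 1.0) - prox_brute(z, 1.0)) < 1e-2
```

The hard thresholding behaviour of this operator is what singles out the 0/1 support vectors: samples whose slack is snapped to zero sit exactly on the support hyperplanes.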
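The splitting pattern behind an ADMM approach to the 0/1 loss model can be sketched as follows. This is a generic illustration under assumed notation (the matrix A stacks rows yᵢ·[xᵢ, 1], and the constraint u = 1 − y∘(Xw + b) is enforced via multipliers), not the paper's 0/1-ADMM, which additionally restricts updates to working sets of 0/1 support vectors:

```python
import numpy as np

def prox_l01_vec(z, lam):
    """Componentwise proximal operator of lam * (0/1 loss): 0 on (0, sqrt(2*lam)], identity elsewhere."""
    out = z.copy()
    out[(z > 0) & (z <= np.sqrt(2 * lam))] = 0.0
    return out

def svm01_admm(X, y, C=1.0, sigma=1.0, iters=100):
    """Schematic ADMM for  min 0.5*||w||^2 + C * sum l01(u)  s.t.  u = 1 - y*(X@w + b).

    A generic three-block update (slack u, variables v = (w, b), multipliers lam);
    the working-set acceleration of the paper's 0/1-ADMM is omitted here.
    """
    n, d = X.shape
    A = np.hstack([y[:, None] * X, y[:, None]])   # rows: y_i * [x_i, 1]
    E = np.diag([1.0] * d + [0.0])                # regularize w but not b
    M = E + sigma * A.T @ A                       # normal matrix of the (w, b)-subproblem
    v = np.zeros(d + 1)
    lam = np.zeros(n)
    for _ in range(iters):
        # slack update: proximal step on the 0/1 loss
        u = prox_l01_vec(1.0 - A @ v - lam / sigma, C / sigma)
        # (w, b) update: ridge-type linear system
        v = np.linalg.solve(M, sigma * A.T @ (1.0 - u - lam / sigma))
        # multiplier update on the constraint residual
        lam = lam + sigma * (u + A @ v - 1.0)
    return v[:d], v[d]
```

On a small linearly separable data set, such as three points per class mirrored through the origin, this sketch recovers a separating hyperplane; the paper's algorithm improves on this template by confining the linear algebra to the 0/1 support vectors.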