The Kolmogorov-Smirnov (KS) statistic has been widely used in many areas to evaluate the performance of binary classifiers. However, almost no classification algorithm optimizes it directly at the training stage, owing to the computational and theoretical challenges posed by the special form of KS. In this paper, we propose a novel Kolmogorov-Smirnov neural Network (KSNet) that uses KS as the optimization objective. The non-smoothness of the empirical KS is overcome by introducing a smooth nonconvex surrogate function. KSNet has great potential to improve the KS on test data, especially for imbalanced data, and it shows encouraging robustness to data noise. Theoretically, we establish a non-asymptotic excess risk bound for KSNet with a ReLU-activated feedforward neural network and show its Bayes-risk consistency. Furthermore, we alleviate the curse of dimensionality by assuming that the input data is supported on a compact low-dimensional manifold. Experiments on a variety of real datasets confirm the advantages of KSNet over many existing methods.
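To make concrete the quantity being optimized, the sketch below computes the empirical KS statistic of a binary classifier: the maximum gap between the empirical CDFs of the scores of the positive and negative classes, equivalently the largest value of TPR minus FPR over all thresholds. This is a minimal illustration of the standard definition; the function name `empirical_ks` and the synthetic data are our own, and the smooth nonconvex surrogate the paper actually trains with is not reproduced here. Note that the `max` over thresholds makes this objective piecewise constant and hence non-smooth in the scores, which is exactly the difficulty the surrogate addresses.

```python
import numpy as np

def empirical_ks(scores, labels):
    """Empirical KS statistic: sup_t |F_pos(t) - F_neg(t)|, the maximum
    vertical gap between the score CDFs of the two classes."""
    pos = np.sort(scores[labels == 1])
    neg = np.sort(scores[labels == 0])
    thresholds = np.sort(scores)
    # Empirical CDFs of positive and negative scores, evaluated at every
    # observed score (the supremum is attained at one of these points).
    f_pos = np.searchsorted(pos, thresholds, side="right") / len(pos)
    f_neg = np.searchsorted(neg, thresholds, side="right") / len(neg)
    return np.max(np.abs(f_pos - f_neg))

# Illustrative usage on synthetic data: scores shifted by class label,
# so the classifier is informative and KS is well above zero.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=1000)
scores = labels + rng.normal(scale=1.0, size=1000)
print(empirical_ks(scores, labels))
```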