We study kernel learning problems with the ramp loss, a non-convex but noise-resistant loss function. In this work, we justify the validity of the ramp loss; moreover, within the classical kernel learning framework, we show that the generalization bound for the empirical ramp risk minimizer is similar to that of convex surrogate losses, which implies that kernel learning with this loss function is not only noise resistant but also, more importantly, statistically consistent. To adapt to large-scale and real-time scenarios, we extend the ramp loss to a multiclass ramp loss and introduce NOLCA, a heuristic online algorithm based on the online gradient descent framework, to solve this learning problem. Empirically, our method achieves performance comparable to batch learning methods while spending much less time in the training phase. Finally, to meet the requirements of noisy, large-scale classification scenarios, we develop the PYTHON package PYNOLCA, which is efficient and easy to use.
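The abstract's key object, the ramp loss, is commonly defined as the hinge loss clipped at 1, i.e. min(1, max(0, 1 - z)) for margin z = y f(x). The following is a minimal illustrative sketch of that standard definition (the function name and parameterization are ours, not from the paper); its boundedness is what gives the noise resistance mentioned above, since a single mislabeled point can contribute at most 1 to the empirical risk.

```python
def ramp_loss(margin: float) -> float:
    """Ramp (clipped hinge) loss of a margin z = y * f(x).

    Behaves like the hinge loss max(0, 1 - z) near the decision
    boundary, but is capped at 1, so outliers and label noise
    cannot contribute an unbounded penalty.
    """
    return min(1.0, max(0.0, 1.0 - margin))


# Correctly classified with large margin: zero loss.
print(ramp_loss(2.0))   # 0.0
# On the margin boundary z = 0: loss 1, same as hinge.
print(ramp_loss(0.0))   # 1.0
# Badly misclassified point: loss is clipped at 1,
# whereas the hinge loss would be 6.
print(ramp_loss(-5.0))  # 1.0
```

In contrast, the unclipped hinge loss grows linearly for negative margins, so noisy points can dominate the empirical risk; the price of clipping is that the ramp loss is non-convex, which is exactly the statistical and algorithmic difficulty the paper addresses.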