
Fully-Corrective Gradient Boosting With Squared Hinge: Fast Learning Rates And Early Stopping

Posted on: 2022-05-03    Degree: Master    Type: Thesis
Country: China    Candidate: M Zhang    Full Text: PDF
GTID: 2518306497452194    Subject: Robotic learning
Abstract/Summary:
With the rapid development of data mining, massive amounts of data are widely available in fields such as medical care, finance, and the Internet. Although such data make predictions more accurate and allow more valuable information to be mined, they also bring a series of scientific challenges, such as storage bottlenecks, algorithm scalability, and interpretability. Boosting is a well-known method for improving the accuracy of weak learners in machine learning; however, a theoretical generalization guarantee for it has been missing from the literature. In this paper, we propose an efficient boosting method, Fully-Corrective Gradient Boosting (FCG for short), with theoretical guarantees for binary classification. The three key ingredients of the proposed method are: a) the fully-corrective greedy update; b) a differentiable squared hinge (also called truncated quadratic) loss function; and c) an efficient alternating direction method of multipliers (ADMM) solver. Compared with traditional boosting methods, the FCG update accelerates the numerical convergence rate on the one hand, while on the other the squared hinge loss inherits the robustness of the hinge loss for classification and retains the theoretical benefits of the square loss in regression. The ADMM solver, which comes with a fast convergence guarantee, then provides an efficient implementation of the suggested boosting method. We conduct both theoretical analysis and numerical experiments to demonstrate the superior performance of the proposed method. Theoretically, a fast learning rate of order O((m/log m)^{-1/2}) is proved under certain standard assumptions, where m is the sample size. Numerically, a series of toy simulations and real-data experiments verify the developed theory.

On the basis of FCG, we further propose Fully-Corrective Super Gradient Boosting (FCSG for short). This algorithm reduces the computational burden while preserving generalization performance. Its effectiveness is verified by simulation experiments, and tests on the same real data confirm the superiority of the FCSG algorithm: FCSG matches the accuracy of FCG while greatly improving time efficiency.
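The three ingredients described above (greedy selection, squared hinge loss, and a fully-corrective coefficient refit) can be illustrated with a minimal Python sketch. This is a hypothetical simplification, not the thesis's algorithm: it assumes decision stumps as weak learners and replaces the ADMM solver with plain gradient descent for the fully-corrective step; all function names are illustrative.

```python
import numpy as np

def squared_hinge_grad(y, f):
    # Gradient of the mean squared hinge loss max(0, 1 - y*f)^2 w.r.t. f.
    # Unlike the plain hinge, this loss is differentiable everywhere.
    return -2.0 * y * np.maximum(0.0, 1.0 - y * f) / len(y)

def stump_predict(X, j, t, s):
    # Decision stump weak learner: s * sign(x_j - t).
    return s * np.sign(X[:, j] - t + 1e-12)

def best_stump(X, r):
    # Greedy step: pick the stump most correlated with the negative gradient r.
    best, best_val = None, -np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1.0, -1.0):
                val = np.dot(r, stump_predict(X, j, t, s))
                if val > best_val:
                    best_val, best = val, (j, t, s)
    return best

def fcg_boost(X, y, n_rounds=10, n_corrective=200, lr=0.02):
    # Fully-corrective gradient boosting sketch: after each greedy stump
    # selection, re-optimize ALL coefficients jointly (here by gradient
    # descent; the thesis uses an ADMM solver instead).
    stumps, coefs = [], np.zeros(0)
    f = np.zeros(len(y))
    for _ in range(n_rounds):
        r = -squared_hinge_grad(y, f)        # negative functional gradient
        stumps.append(best_stump(X, r))
        coefs = np.append(coefs, 0.0)
        H = np.column_stack([stump_predict(X, *p) for p in stumps])
        for _ in range(n_corrective):        # fully-corrective refit
            coefs -= lr * (H.T @ squared_hinge_grad(y, H @ coefs))
        f = H @ coefs
    return stumps, coefs

def predict(X, stumps, coefs):
    H = np.column_stack([stump_predict(X, *p) for p in stumps])
    return np.sign(H @ coefs)
```

On a toy two-cluster problem (labels in {-1, +1}), the fully-corrective refit drives the training loss down quickly because every round re-solves for all coefficients jointly rather than freezing past ones, which is the mechanism behind the accelerated numerical convergence claimed above.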
Keywords/Search Tags:Boosting, learning theory, squared hinge, fully-corrective greedy algorithms, early stopping