
Research on a Gradient Boosting-based Ensemble Scheme for Extreme Learning Machine

Posted on: 2020-05-22
Degree: Master
Type: Thesis
Country: China
Candidate: W Ao
Full Text: PDF
GTID: 2428330599954640
Subject: Computer Science and Technology
Abstract/Summary:
Extreme Learning Machine (ELM) is a novel training scheme for single-hidden-layer feedforward neural networks. Unlike traditional back-propagation-based neural networks, ELM trains extremely fast because it avoids iterative parameter adjustment and time-consuming weight updating, and the universal approximation theorem guarantees its convergence in theory. However, the random initialization of the input-layer weights and hidden-node biases may produce suboptimal parameters, which hurts the generalization capability and prediction robustness of ELM. Moreover, choosing the right number of hidden nodes is crucial to ELM's generalization performance: if too many hidden nodes are assigned, the random mapping generates redundant or irrelevant nodes and raises the risk of over-fitting, while an ELM with too few hidden nodes can hardly fit the true distribution of the training data. Many ELM variants have been proposed to alleviate these weaknesses; among them, ELM ensemble methods are representative and are essential to ELM's theoretical research and practical application.

In this thesis, we propose two ELM ensemble algorithms based on the gradient boosting mechanism. Because boosting focuses on gradually reducing the training residuals at each iteration, and ELM is a multi-parameter neural network with high capacity (especially in classification applications), incorporating gradient boosting into the ELM ensemble procedure directly leads to severe over-fitting. To deal with this difficulty, we first propose a novel gradient boosting-based ensemble scheme for extreme learning machine (GBELM). Instead of combining the ELM network and the gradient boosting method naively, we design an enhanced training scheme that constructs a sequence of weak ELMs whose output-layer weights are learned by additively optimizing a regularized objective. More specifically, GBELM builds its learning objective on the training mechanism of boosting and, to alleviate over-fitting, introduces a regularization term that controls the complexity of the ensemble model. Minimizing the regularized objective under a second-order approximation yields a closed-form formula for the output-layer weights of the ELM added at each iteration. Since the objective favors models built from simple functions, this formula also serves as an optimization criterion for further refining the initial weights: we take the penalty-free output-layer weights determined by the pseudo-residuals as a heuristic initialization, update them iteratively with the criterion, and thereby obtain the optimal output-layer weights for each weak ELM.
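The abstract does not state the objective in closed form, so the following is one plausible reading under stated assumptions, not the thesis's exact formulation. Suppose the ensemble prediction after t-1 rounds is \hat{y}^{(t-1)}, the ELM added at iteration t has hidden-layer output matrix H_t and output weights \beta_t, and the output weights carry an \ell_2 penalty with coefficient \lambda:

\[
L^{(t)} = \sum_{i=1}^{n} \ell\left(y_i,\ \hat{y}_i^{(t-1)} + h(x_i)\beta_t\right) + \frac{\lambda}{2}\lVert \beta_t \rVert^2 .
\]

Expanding the loss to second order with gradients g_i = \partial_{\hat{y}}\ell and curvatures s_i = \partial^2_{\hat{y}}\ell, and setting the gradient with respect to \beta_t to zero, gives the closed-form update

\[
\beta_t = -\left(H_t^{\top}\,\mathrm{diag}(s)\,H_t + \lambda I\right)^{-1} H_t^{\top} g .
\]

For squared loss, s_i = 1 and -g_i is the pseudo-residual r_i = y_i - \hat{y}_i^{(t-1)}, so the update reduces to the ridge solution \beta_t = (H_t^{\top} H_t + \lambda I)^{-1} H_t^{\top} r.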
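Below is a minimal NumPy sketch of such a boosting loop over weak ELMs, assuming squared loss, a sigmoid hidden activation, and a shrinkage rate; the function names (fit_gbelm, elm_hidden), the hyperparameter values, and the constant initial model are illustrative choices, not details taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_hidden(X, W, b):
    # Random hidden-layer mapping of one weak ELM: sigmoid(X @ W + b).
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def fit_gbelm(X, y, n_rounds=50, n_hidden=20, lam=1.0, lr=0.1):
    # Gradient-boosted ELM ensemble for squared loss (hedged sketch).
    # Each round draws random input weights and biases, then solves a
    # ridge problem for the output weights against the pseudo-residuals.
    n, d = X.shape
    pred = np.full(n, y.mean())           # constant initial model
    ensemble = []
    for _ in range(n_rounds):
        residual = y - pred               # pseudo-residuals for squared loss
        W = rng.standard_normal((d, n_hidden))
        b = rng.standard_normal(n_hidden)
        H = elm_hidden(X, W, b)
        # Regularized output weights: (H^T H + lam*I)^{-1} H^T residual
        beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden),
                               H.T @ residual)
        pred += lr * (H @ beta)           # shrunken additive update
        ensemble.append((W, b, beta))
    return ensemble, pred
```

Prediction on new data would sum y.mean() plus lr times each stored ELM's contribution. Note that GBELM's optimization criterion additionally refines the heuristic initial weights iteratively, which this sketch omits.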
Even so, the gradient boosting method constructs its additive ensemble by sequentially fitting a weak learner to the pseudo-residuals of the entire training set at each iteration, which costs considerable training time and may still over-fit. In view of this, we propose a minor modification named stochastic gradient boosting extreme learning machine (SGB-ELM), which injects randomization into the procedure: at each iteration, a randomly sampled training subset, rather than the whole training set, is used to fit the newly introduced base learner. SGB-ELM thus adds sample perturbation on top of the existing parameter perturbation and effectively increases the diversity of the constructed ensemble (a minimal sketch of this variant follows the summary below).

Compared with several typical ELM ensemble methods on benchmark datasets, GBELM and SGB-ELM achieve better generalization capability and prediction robustness in both regression and classification tasks, which demonstrates the feasibility and effectiveness of the two algorithms.
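For concreteness, here is how the stochastic variant would modify the loop in the fit_gbelm sketch above: only a random fraction of the training set (a hypothetical subsample parameter) is used to fit each new ELM, while predictions are still updated on all samples. It reuses elm_hidden and rng from the previous sketch and shares all of its assumptions.

```python
def fit_sgb_elm(X, y, n_rounds=50, n_hidden=20, lam=1.0, lr=0.1,
                subsample=0.5):
    # Stochastic variant of fit_gbelm: each new weak ELM is fitted on a
    # randomly drawn subset of the training data (hedged sketch).
    n, d = X.shape
    pred = np.full(n, y.mean())
    ensemble = []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=max(1, int(subsample * n)), replace=False)
        W = rng.standard_normal((d, n_hidden))
        b = rng.standard_normal(n_hidden)
        H = elm_hidden(X[idx], W, b)      # hidden outputs on the subset only
        beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden),
                               H.T @ (y[idx] - pred[idx]))
        pred += lr * (elm_hidden(X, W, b) @ beta)   # update all predictions
        ensemble.append((W, b, beta))
    return ensemble, pred
```

Fitting on a subset both cuts the per-round cost of the ridge solve and perturbs the training sample each weak ELM sees, which is the source of the added ensemble diversity described above.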
Keywords/Search Tags:Ensemble Learning, Gradient Boosting Machine, Extreme Learning Machine, Generalization Capability, Optimization Criterion