
A Study Of AdaBoost Extreme Learning Machine Based On Particle Swarm Optimization

Posted on: 2019-05-01
Degree: Master
Type: Thesis
Country: China
Candidate: H G Liu
Full Text: PDF
GTID: 2428330566472821
Subject: Control Science and Engineering
Abstract/Summary:
Although the extreme learning machine (ELM) offers easy tuning, fast training, and good generalization performance, its behavior is unstable: the random initialization of input weights and hidden-layer biases may produce an ill-conditioned hidden-layer output matrix. AdaBoost, with its fast convergence, high accuracy, and ability to use any classification algorithm as a base classifier, is often applied to strengthen a single learning model, so adopting ELM as the base classifier of AdaBoost can compensate for the weaknesses of a single ELM model. Owing to its simple encoding and fast convergence, particle swarm optimization (PSO) is widely used in model optimization problems. In this thesis, therefore, ELM is adopted as the base classifier in AdaBoost, and, by further considering the diversity of the base classifiers and the error rate of the ensemble, a swarm-diversity-guided standard particle swarm optimization algorithm (DGAP-MSPSO) is used to optimize the ensemble and further improve its generalization performance. The main work of the thesis is as follows:

(1) A swarm-diversity-guided standard particle swarm optimization algorithm (DGAP-MSPSO) is proposed. DGAP-MSPSO improves search performance by adjusting the search strategy under the guidance of swarm diversity, preventing particles from becoming trapped in local optima. At the same time, it inherits the rotation-invariance property and maintains good search ability on high-dimensional problems. Compared with conventional particle swarm optimization, DGAP-MSPSO is better suited to optimizing an ELM ensemble. Experimental results on the CEC2005 benchmark functions verify its favorable search ability.

(2) Based on base-classifier diversity and the multi-class AdaBoost algorithm (SAMME), an ensemble method for ELM, Diverse-SAMME-ELM, is proposed. The algorithm adopts Weighted ELM as the base classifier, solving the problem that conventional ELM cannot be applied directly in AdaBoost because of AdaBoost's cost-sensitive reweighting of samples. By adjusting the regularization coefficient of Weighted ELM, a set of classifiers of moderate accuracy is obtained, avoiding the performance degradation caused by combining strong classifiers. By introducing a measure of base-classifier diversity, base classifiers that do not meet the diversity requirement may be filtered out at each iteration, and a new base classifier is regenerated on the current sample distribution. Experimental results on 9 UCI data sets verify the strong generalization performance of the proposed algorithm.

(3) Because of the long iteration time and complex structure of Diverse-SAMME-ELM, a new DGAP-MSPSO-based ensemble method, DSPSO-SAMME-ELM, is proposed. Like Diverse-SAMME-ELM, it accounts for base-classifier diversity; the difference is that DSPSO-SAMME-ELM quickly obtains a set of ELM classifiers through the SAMME algorithm structure without filtering out base classifiers. The classifier diversity and the error rate of the ensemble are then jointly optimized by DGAP-MSPSO, simplifying the structure of the ensemble and improving its generalization ability.
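As background for the base learner used throughout, a minimal ELM can be sketched as follows. This is an illustrative NumPy version, not the thesis's implementation: the class name, activation (tanh), and hyperparameters are placeholder assumptions. It shows the defining trait mentioned in the abstract: input weights and biases are random and never trained, and only the output weights are solved for via the Moore-Penrose pseudoinverse of the hidden-layer output matrix.

```python
import numpy as np

class ELM:
    """Minimal single-hidden-layer extreme learning machine (sketch).

    Input weights and hidden biases are drawn at random and left fixed;
    only the output weights beta are computed, by a least-squares solve
    using the Moore-Penrose pseudoinverse of the hidden output matrix H.
    """

    def __init__(self, n_hidden=20, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        # Random input weights and biases -- the source of the instability
        # (possibly ill-conditioned H) that the thesis sets out to fix.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # hidden-layer output matrix
        self.classes_ = np.unique(y)
        # One-hot targets, then beta = pinv(H) @ T.
        T = (y[:, None] == self.classes_[None, :]).astype(float)
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return self.classes_[np.argmax(H @ self.beta, axis=1)]
```

Because `pinv` gives the minimum-norm least-squares solution, training requires no iterative tuning, which is exactly the "fast training speed" the abstract credits to ELM.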
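The abstract does not give DGAP-MSPSO's update equations, but the general idea of diversity-guided PSO, switching to a repulsive phase when swarm diversity collapses so particles can escape local optima, can be sketched as below. The function name, the diversity measure, the threshold `d_low`, and all coefficients are illustrative assumptions, not the thesis's algorithm.

```python
import numpy as np

def diversity(positions):
    """Mean distance of the particles from the swarm centroid."""
    return np.mean(np.linalg.norm(positions - positions.mean(axis=0), axis=1))

def dg_pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0,
           d_low=0.05, seed=0):
    """Diversity-guided PSO sketch: attract while spread out, repel
    from the attractors once diversity falls below d_low."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n_particles, dim))
    V = np.zeros_like(X)
    pbest = X.copy()
    pbest_val = np.array([f(x) for x in X])
    g = pbest[np.argmin(pbest_val)].copy()        # global best position
    w, c1, c2 = 0.7, 1.5, 1.5                     # standard PSO coefficients
    for _ in range(iters):
        # Flip the attraction term's sign when diversity collapses,
        # pushing particles apart so the search can re-explore.
        sign = 1.0 if diversity(X) > d_low else -1.0
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + sign * (c1 * r1 * (pbest - X) + c2 * r2 * (g - X))
        X = np.clip(X + V, lo, hi)
        vals = np.array([f(x) for x in X])
        improved = vals < pbest_val               # personal bests only improve
        pbest[improved] = X[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()
```

In the thesis's setting, `f` would be an objective combining the ensemble's error rate and base-classifier diversity rather than a benchmark function.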
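To make the Diverse-SAMME-ELM idea concrete, here is a hedged sketch of ELM base learners inside the SAMME boosting loop. Treating the AdaBoost sample weights as weighted least-squares weights in the output-layer solve is one common way to obtain a "Weighted ELM"; the thesis's exact formulation, its regularization coefficient, and its diversity filter are not reproduced here. The SAMME classifier weight `alpha = log((1-err)/err) + log(K-1)` is the standard multi-class AdaBoost formula.

```python
import numpy as np

def fit_weighted_elm(X, y, sw, n_hidden, rng):
    """One Weighted-ELM base learner: sample weights sw enter the
    least-squares solve for the output weights (a simplification)."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    classes = np.unique(y)
    T = (y[:, None] == classes[None, :]).astype(float)
    S = np.sqrt(sw)[:, None]                      # weighted least squares
    beta = np.linalg.pinv(S * H) @ (S * T)
    return lambda Xq: classes[np.argmax(np.tanh(Xq @ W + b) @ beta, axis=1)]

def samme_elm(X, y, n_rounds=10, n_hidden=20, seed=0):
    """SAMME boosting over Weighted-ELM base learners (sketch; the
    thesis's diversity-based filtering of base classifiers is omitted)."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    n, K = len(y), len(classes)
    sw = np.full(n, 1.0 / n)                      # uniform sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        h = fit_weighted_elm(X, y, sw, n_hidden, rng)
        miss = h(X) != y
        err = max(float(sw @ miss), 1e-12)
        if err >= 1 - 1.0 / K:                    # worse than random: skip
            continue
        alpha = np.log((1 - err) / err) + np.log(K - 1)   # SAMME weight
        sw *= np.exp(alpha * miss)                # up-weight misclassified
        sw /= sw.sum()
        learners.append(h)
        alphas.append(alpha)

    def predict(Xq):
        votes = np.zeros((len(Xq), K))
        for h, a in zip(learners, alphas):
            votes[np.arange(len(Xq)), np.searchsorted(classes, h(Xq))] += a
        return classes[np.argmax(votes, axis=1)]
    return predict
```

Deliberately weakening each base ELM (in the thesis, by adjusting the Weighted ELM regularization coefficient) matters here: SAMME's reweighting only yields diverse, complementary classifiers when the base learners are of moderate rather than near-perfect accuracy.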
Keywords/Search Tags: Ensemble learning, multi-class AdaBoost, extreme learning machine, standard particle swarm optimization, diversity