
Research On Stacking Classification Model Based On Adaptive Tuning

Posted on: 2020-06-19 | Degree: Master | Type: Thesis
Country: China | Candidate: Z H Chen | Full Text: PDF
GTID: 2438330599955734 | Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
In recent years, statistical classification has gained massive attention in machine learning. For classification problems, the performance of a machine learning algorithm depends strongly on the characteristics of the individual model. One possible solution is to use ensemble learning to combine several learning algorithms [1]. In an ensemble, the individual classifiers contribute complementary information about unknown examples, which can improve the overall classification in terms of both accuracy and generalization. The difficulty of ensemble learning lies in how to combine learners that are "good but different" and how to set the hyperparameters. This thesis addresses the problems of learner selection and hyperparameter setting in ensemble learning.

We propose a new self-adaptive stacking ensemble model (SSEM). First, using the stacking mechanism combined with a neural-network model at the combination layer, a hybrid ensemble model is constructed based on the correlation of parameters and a model-diversity measure (the Q statistic). Second, two optimization problems are formulated to improve the hybrid ensemble: model-combination optimization and parameter-combination optimization. The global search ability of a genetic algorithm is used to solve both, with classification accuracy as the fitness function; the population is evolved through crossover and mutation, and the search stops after a fixed number of generations, finally yielding the optimal model combination and hyperparameter settings. Unlike other ensemble learning algorithms, SSEM can adaptively select the optimal model combination and parameter settings for each data set.

To verify the performance and applicability of SSEM, this thesis applies it to eight different fields (sentiment classification, image classification, face recognition, text classification, financial data classification, and social, computer, and daily-life classification) using nine data sets, and compares it with seven established classifiers (Naive Bayes, Extremely Randomized Trees, Logistic Regression, Random Forest, CART decision tree, AdaBoost, and Bagging) under four evaluation metrics (accuracy, recall, F1 score, and Matthews correlation coefficient). SSEM outperforms the seven baseline classifiers on eight of the nine data sets. On the Fudan data set its results tie with those of the Extremely Randomized Trees and the CART decision tree, but the classifier that SSEM adaptively selects on that set is precisely the CART decision tree, which further demonstrates that the proposed algorithm can adaptively select the optimal model combination and parameter settings.

The present work is only a beginning. In the future, stronger classifiers will appear, and the weighting of base classifiers in the model layer should also be considered, for example by using deep adaptive-parameter ensemble learning to stack more layers and assign a weight to each classifier.
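As a rough illustration of the stacking mechanism described above, the following Python sketch combines heterogeneous base learners through scikit-learn's StackingClassifier, with a small neural network as the meta-learner to mirror the neural-network combination layer of SSEM. The specific base learners, data set, and hyperparameters are illustrative assumptions, not the thesis's exact configuration.

```python
# Minimal stacking sketch: heterogeneous base learners combined by a
# neural-network meta-learner.  Base learners and settings are placeholders.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (StackingClassifier, RandomForestClassifier,
                              ExtraTreesClassifier)
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

base_learners = [
    ("nb", GaussianNB()),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("et", ExtraTreesClassifier(n_estimators=100, random_state=0)),
]

stack = StackingClassifier(
    estimators=base_learners,
    # Small neural network at the combination layer of the stacking model.
    final_estimator=MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                                  random_state=0),
    cv=5,  # out-of-fold base-learner predictions feed the meta-learner
)

print(cross_val_score(stack, X, y, cv=5, scoring="accuracy").mean())
```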
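The Q statistic mentioned above is the standard pairwise diversity measure, Q = (N11·N00 − N01·N10) / (N11·N00 + N01·N10), where N11 counts samples both classifiers predict correctly, N00 samples both predict wrongly, and N10/N01 samples only one of them gets right. A minimal sketch, assuming 1-D arrays of predicted and true labels:

```python
import numpy as np

def q_statistic(pred_a, pred_b, y_true):
    """Pairwise Q statistic for two classifiers.

    Q is near 0 for independent classifiers, near 1 for classifiers that
    err on the same samples, and near -1 when they err on different ones.
    """
    a_ok = pred_a == y_true
    b_ok = pred_b == y_true
    n11 = np.sum(a_ok & b_ok)    # both correct
    n00 = np.sum(~a_ok & ~b_ok)  # both wrong
    n10 = np.sum(a_ok & ~b_ok)   # only A correct
    n01 = np.sum(~a_ok & b_ok)   # only B correct
    denom = n11 * n00 + n01 * n10
    return 0.0 if denom == 0 else (n11 * n00 - n01 * n10) / denom

# Toy usage: the two classifiers err on different samples, so Q = -1
# (maximally diverse by this measure).
y  = np.array([1, 0, 1, 1, 0, 1])
pa = np.array([1, 0, 0, 1, 0, 1])  # wrong only on sample 2
pb = np.array([1, 1, 1, 1, 0, 1])  # wrong only on sample 1
print(q_statistic(pa, pb, y))      # -1.0
```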
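The genetic search over model combinations can likewise be sketched in miniature. Here a chromosome is a binary mask selecting which candidate base learners enter the stacking ensemble, fitness is cross-validated accuracy, and the population evolves by one-point crossover and bit-flip mutation for a fixed number of generations. The encoding, operators, rates, and population size are illustrative assumptions, not the thesis's exact design; hyperparameter genes could be appended to the chromosome in the same way.

```python
# Hedged GA sketch: evolve a binary mask over candidate base learners,
# using cross-validated accuracy of the resulting stacking model as fitness.
import random
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (StackingClassifier, RandomForestClassifier,
                              ExtraTreesClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
CANDIDATES = [
    ("nb", GaussianNB()),
    ("lr", LogisticRegression(max_iter=1000)),
    ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ("et", ExtraTreesClassifier(n_estimators=50, random_state=0)),
]

def fitness(mask):
    chosen = [est for bit, est in zip(mask, CANDIDATES) if bit]
    if len(chosen) < 2:          # stacking needs at least two base learners
        return 0.0
    stack = StackingClassifier(
        estimators=chosen,
        final_estimator=MLPClassifier(hidden_layer_sizes=(16,),
                                      max_iter=500, random_state=0),
        cv=3,
    )
    # In practice fitness values would be cached; recomputed here for brevity.
    return cross_val_score(stack, X, y, cv=3, scoring="accuracy").mean()

def evolve(pop_size=8, generations=5, p_mut=0.1):
    rng = random.Random(0)
    pop = [[rng.randint(0, 1) for _ in CANDIDATES] for _ in range(pop_size)]
    for _ in range(generations):                     # stop after fixed generations
        parents = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(CANDIDATES))  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("selected:", [name for bit, (name, _) in zip(best, CANDIDATES) if bit])
```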
Keywords/Search Tags: classification problem, ensemble learning, genetic algorithm, parameter optimization and combination optimization