As an important data-mining technique, classification is widely applied in many areas. Many classification techniques exist, such as decision trees and support vector machines. In the traditional approach, one trains many classifiers on the training set, evaluates the classification performance of each on a test set, and selects the best-performing one as the final model. However, the prediction performance of any single classification model is limited: each classifier fits certain regions of the data better than others, which makes the performance of a single classifier unstable. By combining multiple classifiers, we can improve classification performance while also stabilizing the prediction results; ensemble techniques for combining multiple classifiers arose to meet this need.

Ensemble techniques improve classification performance by combining several single classifiers, each called a base classifier. An ensemble method consists of two main parts: the method for generating base classifiers and the method for combining them. Boosting, a typical ensemble learning method, generates base classifiers by maintaining a weight distribution over the training set, so that in each iteration the base classifier focuses on different instances. Bagging is another important ensemble method; it generates base classifiers by sampling the training set with replacement in every iteration, so that every instance has an equal chance of being selected. MultiBoost can be viewed as combining Boosting with wagging, and it has been demonstrated to produce decision committees with lower error than either Boosting or wagging alone.
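As a concrete illustration, the two generation schemes described above can be sketched as follows. This is a minimal Python sketch, not code from the thesis; the function names are our own, and the reweighting rule shown is the standard AdaBoost-style update (assuming a base error rate strictly between 0 and 0.5).

```python
import math
import random

def bootstrap_sample(data, rng):
    """Bagging-style bootstrap sample: each instance is drawn uniformly
    with replacement, so every instance has an equal chance of selection."""
    return [rng.choice(data) for _ in data]

def boosting_reweight(weights, correct):
    """One AdaBoost-style weight update: misclassified instances gain
    weight so the next base classifier focuses on them.
    Assumes the weighted error is strictly between 0 and 0.5."""
    err = sum(w for w, c in zip(weights, correct) if not c)
    alpha = 0.5 * math.log((1 - err) / err)  # this classifier's vote weight
    new = [w * math.exp(-alpha if c else alpha)
           for w, c in zip(weights, correct)]
    z = sum(new)  # normalize so the weights again form a distribution
    return [w / z for w in new], alpha
```

With four equally weighted instances and one misclassification, the update concentrates half of the total weight on the misclassified instance, which is the classic behavior that forces later base classifiers to attend to the hard cases.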
MultiBoost randomly sets the weights of the training instances according to a Poisson distribution and then produces the base classifiers. Based on an analysis of these methods, this thesis proposes two methods for generating base classifiers by weighting training instances.

The combination of base classifiers is the other important aspect of ensemble classification. When the base classifiers output category labels, voting is the method commonly used. Bagging applies a simple vote, in which every base classifier contributes equally to the final decision. In Boosting, each base classifier's voting weight depends on its error rate, and the final decision is made by weighted voting. This thesis analyzes the weighted voting method in detail and, based on different considerations, proposes two methods for adjusting the voting weights. The experimental results show that the proposed methods can improve classification performance.
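For reference, weighted voting over categorical outputs can be sketched as follows. This is a minimal Python sketch with illustrative names, not code from the thesis; simple voting is the special case in which all weights are equal.

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Combine the categorical outputs of base classifiers by weighted
    voting: each classifier adds its weight to the class it predicts,
    and the class with the largest weight total wins."""
    scores = defaultdict(float)
    for label, weight in zip(predictions, weights):
        scores[label] += weight
    return max(scores, key=scores.get)
```

With equal weights this reduces to Bagging's simple majority vote; in a Boosting-style ensemble, each weight would typically be derived from the corresponding base classifier's error rate.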