
Weight Learning and Differential Evolution in Machine Learning

Posted on: 2016-03-01
Degree: Doctor
Type: Dissertation
Country: China
Candidate: C R Dong
Full Text: PDF
GTID: 1108330479993402
Subject: Computer application technology
Abstract/Summary:
Machine learning is one of the core research problems in artificial intelligence (AI) and an essential way to make computers think like humans. Since the 1950s, many machine learning algorithms have been proposed, such as decision trees, neural networks, support vector machines, K-means, and the K-nearest-neighbor algorithm. However, real-world data usually exhibits one or more of the following characteristics: noisy samples, incomplete data, uncertainty, imbalanced class distributions, biased feature weights, huge numbers of samples, etc. These issues have become more serious with the dramatic development of computing and communication networks. Many improved algorithms have been proposed to deal with imperfect data, e.g. ensemble learning systems with multiple base learners, fuzzy reasoning systems for fuzzy data, weighted learning systems, and efficient evolutionary learning algorithms. Among these, weighted learning methods and fuzzy systems are two of the most widely used, yet some interesting research problems remain unsolved. For example, how can one obtain a set of "good" weights to improve the performance of a weighted learning system? Is there any relationship between a fuzzy classifier's performance and its output vectors?

On several selected learning models, we thoroughly study the optimization model of weights and the relationship between fuzziness and the performance of fuzzy classifiers. Furthermore, we adopt an improved differential evolution algorithm to solve the weight optimization problem.
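As background for the weight-optimization approach, the following is a minimal sketch of classic differential evolution (DE/rand/1/bin) applied to a weight vector. The toy fitness function, bounds, and control parameters F and CR are illustrative assumptions, not the MEHDE variant proposed in the thesis:

```python
import numpy as np

def differential_evolution(fitness, dim, pop_size=20, F=0.5, CR=0.9,
                           bounds=(0.0, 1.0), generations=100, seed=0):
    """Classic DE/rand/1/bin: evolve a population of weight vectors
    toward lower fitness (e.g. a clustering or training error)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([fitness(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            # mutation: base vector plus scaled difference vector
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # binomial crossover with one guaranteed gene from the mutant
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            f_trial = fitness(trial)
            if f_trial <= fit[i]:          # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Toy use: recover a weight vector minimizing a quadratic bowl around 0.3.
w, err = differential_evolution(lambda x: np.sum((x - 0.3) ** 2), dim=5)
```

The greedy selection step guarantees the best fitness never worsens between generations, which is the property that hybrid variants such as MEHDE build on by mixing several mutation strategies.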
In detail, this thesis completes the following works.

(a) First, we propose a multiple-evolutionary-strategy based hybrid differential evolution (MEHDE), and use it to determine the feature weights for weighted fuzzy clustering.

(b) Second, in order to optimize the network structure and random parameters (including input weights and hidden-unit biases) of the more complicated Extreme Learning Machine (ELM), we adopt a self-adaptation technique for both the evolutionary strategies and the control parameters of MEHDE, yielding Sa-MEHDE. A two-stage evolutionary ELM (E-ELM) based on Sa-MEHDE is then proposed.

(c) Third, based on an analysis of the relationship between weights and fuzzy reasoning systems, we propose a weight tuning model based on the well-known fuzzy entropy maximization principle.

(d) Finally, we study the influence of the fuzziness of the fuzzy base classifiers' output vectors on the generalization capability of an ensemble learning system. We reach several significant conclusions through experimental verification or theoretical proof; these conclusions provide suggestions for the selection of base classifiers and the optimization of system parameters.

The major contributions of this thesis are summarized as follows.

(1) We propose the MEHDE algorithm to learn the feature weights of weighted fuzzy clustering. Compared with existing algorithms, MEHDE is well suited to both global search and local search, so it usually yields better search performance without significantly increasing the computational cost.

(2) Based on the Sa-MEHDE algorithm, a two-stage evolutionary ELM is proposed to obtain an ELM automatically. In stage 1, only the network architecture is optimized and converged to a small space using a simplified Sa-MEHDE; in stage 2, the network architecture and the random parameters are optimized simultaneously.
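To illustrate where the feature weights of contribution (1) enter the clustering model, here is one common form of a feature-weighted fuzzy c-means objective; the weight exponent beta and this exact objective are assumptions for illustration, and MEHDE would search over the weight vector w to minimize such an objective or a clustering-validity index:

```python
import numpy as np

def weighted_fcm_objective(X, centers, U, w, m=2.0, beta=2.0):
    """Objective of a feature-weighted fuzzy c-means:
    J = sum_i sum_k u_ik^m * sum_j w_j^beta * (x_ij - c_kj)^2,
    where w_j is the weight of feature j (w_j >= 0, sum_j w_j = 1),
    U holds fuzzy memberships u_ik, and m is the fuzzifier."""
    # squared per-feature differences, weighted by w_j^beta, summed over j
    d2 = np.einsum('j,ikj->ik', w ** beta,
                   (X[:, None, :] - centers[None, :, :]) ** 2)
    return float(np.sum((U ** m) * d2))

# Two samples sitting exactly on their own centers give zero objective.
X = np.array([[0.0, 0.0], [1.0, 1.0]])
C = X.copy()
U = np.eye(2)
J0 = weighted_fcm_objective(X, C, U, w=np.array([0.5, 0.5]))
```

Down-weighting a noisy or irrelevant feature shrinks its contribution to every sample-to-center distance, which is the mechanism the learned weights exploit.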
Because the number of random parameters depends strongly on the number of hidden units, our method greatly reduces the search space of stage 2. It thus determines the network architecture and random parameters of the ELM automatically, while alleviating the complexity and redundancy of the search space. Compared with the basic ELM, our method usually obtains a smaller SLFN with better or comparable performance.

(3) For the weighted fuzzy reasoning system, we propose a weight optimization model based on the well-known fuzzy entropy maximization principle. In contrast to most traditional methods, which minimize a training or validation error, the weights of the weighted fuzzy rules are regarded as tunable parameters. We first determine a set of initially weighted fuzzy rules (with all weights equal to 1) by any selected fuzzy rule extraction method. Under the condition that these fuzzy rules satisfy all known constraints (e.g. all training samples are correctly classified), the weights are then optimized by maximizing the fuzzy entropy of the fuzzy rule reasoning system on the training set. Our numerical simulations verify that the proposed model avoids over-fitting to a great extent and therefore obtains a better classifier.

(4) For a fuzzy-classifier based ensemble learning system, we first study the relationship between the uncertainties (e.g. fuzziness, ambiguity) of the fuzzy classifiers' output vectors and the testing accuracy of the ensemble learning system. Our conclusions are summarized as follows.

a) For classification problems with complex decision boundaries, samples with large fuzziness are more likely to be misclassified than samples with small fuzziness.

b) The set of samples located near the decision boundary is identical to the set of samples with high fuzziness, but the one-to-one mapping between them is difficult to find and depends on the definition of boundary samples.
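For context on contribution (2): a basic ELM trains a single-hidden-layer feedforward network (SLFN) by drawing the input weights and hidden biases at random and solving the output weights analytically by least squares; the two-stage E-ELM instead lets Sa-MEHDE choose the architecture and these random parameters. A minimal sketch, in which the tanh activation and Gaussian initialization are assumptions:

```python
import numpy as np

def train_elm(X, T, n_hidden=20, seed=0):
    """Basic ELM: random hidden layer, analytic output weights.
    E-ELM would search over n_hidden, W and b instead of fixing them."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                  # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Fit a smooth 2-D target; with enough hidden units the training
# residual of the least-squares solve is essentially zero.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 2))
T = np.sin(3 * X[:, :1]) + X[:, 1:]
W, b, beta = train_elm(X, T, n_hidden=60, seed=1)
mse = float(np.mean((elm_predict(X, W, b, beta) - T) ** 2))
```

The point the thesis exploits is visible here: once the number of hidden units is fixed, the dimensionality of (W, b) is fixed too, which is why optimizing the architecture first shrinks the stage-2 search space.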
c) When the training accuracy is acceptable, a classifier with higher-fuzziness outputs has better generalization on complex decision boundary problems. This is experimentally verified in this thesis.
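The conclusions above rank samples and classifiers by the fuzziness of output vectors, and contribution (3) maximizes a closely related fuzzy entropy. A common measure, assumed here since the thesis may use a different normalization, is the De Luca-Termini fuzziness of a membership vector, maximal when every membership equals 0.5:

```python
import numpy as np

def output_fuzziness(p, eps=1e-12):
    """De Luca-Termini fuzziness of a fuzzy classifier's output
    (membership) vector p, averaged over classes and normalized to
    [0, 1]: 1 when every membership is 0.5, 0 when all are 0 or 1."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0 - eps)
    h = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))
    return float(np.mean(h))

# A confident output vector scores low, an uncertain one scores high;
# high-fuzziness outputs flag samples that behave like boundary samples.
confident = output_fuzziness([0.95, 0.03, 0.02])
uncertain = output_fuzziness([0.40, 0.35, 0.25])
```

Sorting test samples by this score is one simple way to realize conclusion b) in practice: the top of the ranking concentrates the samples most likely to sit near the decision boundary.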
Keywords/Search Tags: Differential evolution, Machine learning, Weight learning, Extreme learning machine, Ensemble learning, Generalization capability, Weighted fuzzy rule system