
Study On The Incremental Machine Learning Algorithms

Posted on: 2014-01-09
Degree: Doctor
Type: Dissertation
Country: China
Candidate: R Hu
Full Text: PDF
GTID: 1228330395983694
Subject: Control Science and Engineering
Abstract/Summary:
With the rapid development of the Internet, data have become very easy to acquire in many applications, but extracting useful information from this ever-growing data is a hard problem for traditional batch learning techniques. As the scale of the data grows, the time and space required for learning grow rapidly as well; ultimately, the speed of learning cannot keep up with the speed at which the data are updated. Machine learning is an effective way to address this problem, but traditional machine learning methods are batch methods: all data must be available before learning begins. To meet the demands of online learning, one may have to discard previous learning results and retrain the network from scratch, which costs considerable time and space. There is therefore an urgent need to study incremental methods, which update knowledge gradually, revise and reinforce previously learned knowledge, and adapt it to newly arriving data.

This dissertation investigates in depth incremental singular value decomposition and incremental fuzzy neural networks. The main contributions are as follows:

1. A candid covariance-free incremental singular value decomposition. The traditional approach to SVD is a batch method that requires all training data to be available before computation begins, so it cannot meet online learning requirements. This dissertation develops a candid covariance-free incremental SVD in which the covariance statistics are estimated from the currently arriving sample data. We analyze how to compute the first eigenvector from the currently arriving high-dimensional data and give both intuitive and theoretical explanations. The method generates "observations" in a complementary space for computing the higher-order eigenvectors, so the orthogonality of the eigenvectors is preserved throughout. This reduces both time and space costs.
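As an illustrative sketch only (not the dissertation's exact formulation), a covariance-free incremental eigenvector update in this spirit can be written as below. The function name, the plain running-average scheme, and the lazy initialization are assumptions; the deflation step is what generates the "observations" in a complementary space so that later eigenvectors stay orthogonal to earlier ones.

```python
import numpy as np

def ccipca_update(V, x, n, k):
    """One covariance-free incremental step: refine the first k eigenvector
    estimates with a single sample x, without ever forming the covariance
    matrix.
    V : (k, d) array of current (unnormalized) eigenvector estimates
    n : number of samples seen so far, starting at 1
    """
    u = np.asarray(x, dtype=float).copy()
    for i in range(min(k, n)):
        if n == i + 1:
            V[i] = u            # lazily initialize the i-th estimate
        else:
            # incremental average of the "observation" (u u^T) v_i / ||v_i||
            V[i] = (n - 1) / n * V[i] \
                 + (1.0 / n) * u * (u @ V[i]) / np.linalg.norm(V[i])
        # deflate: project u into the complementary space of v_i, so the
        # next eigenvector is estimated orthogonally to this one
        vi = V[i] / np.linalg.norm(V[i])
        u = u - (u @ vi) * vi
    return V
```

Feeding samples one at a time keeps memory at O(kd) regardless of how many samples arrive, which is the time/space saving the contribution claims over batch SVD.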
2. A pruning-free incremental sequential learning fuzzy neural network. Identifying the structure of a fuzzy neural network is time-consuming. To avoid producing redundant rules and to improve learning efficiency, a pruning step is usually introduced into the rule-growing process. This dissertation presents an incremental sequential learning algorithm that requires no rule pruning: it uses the rate of error descent to measure each rule's contribution to the system and takes this measure as the rule-growing criterion, so no redundant rules are generated during growth. Because each rule's contribution is computed from the currently arriving data, the method is incremental.

3. An optimal incremental extreme learning fuzzy neural network. The extreme learning machine (ELM) is a simple yet effective algorithm for training single-hidden-layer feedforward networks (SLFNs) with random hidden nodes, and it has been shown to be accurate and fast both theoretically and experimentally. We extend ELM to an online incremental setting. First, a set of simple antecedents is generated randomly, together with random values for the parameters of the input membership functions. SVD is then used to rank the fuzzy basis functions, the best number of fuzzy rules is selected through a fast computation of the leave-one-out validation error, and finally the consequent parameters are determined analytically. A comparison against well-known neuro-fuzzy methods shows that the proposed method is robust and competitive in both accuracy and speed.

4. A self-adaptive incremental learning fuzzy neural network based on the significance of a neuron. In a fuzzy neural network, a fuzzy rule may be active at the beginning and later become less important to the system. This dissertation presents a self-adaptive sequential incremental learning fuzzy neural network (SAIL-FNN) algorithm based on the influence of each rule.
The algorithm uses the concept of the "significance" of a neuron and links it to learning accuracy. The significance of a neuron is defined by its contribution to the network output over the currently received input data. A new neuron is considered for addition only when its significance exceeds a threshold; at the same time, all existing neurons are checked, and any neuron whose significance falls below a predefined value is removed. The extended Kalman filter is then used to update the parameters. Simulation results indicate that the SAIL-FNN algorithm provides comparable generalization performance with a considerably reduced network size and training time.

5. A face recognition algorithm based on an incremental learning fuzzy neural network and the Haar wavelet. To improve sample quality and thereby enhance recognition accuracy, this dissertation proposes a novel facial feature extraction method based on an incremental learning fuzzy neural network. First, the Haar wavelet is applied to decompose a typical human face image. The high-frequency component, an important part of the facial features, is preserved, while the dimensionality of the low-frequency component is reduced with Fisher linear discriminant (FLD) analysis. The preserved high-frequency part, combined with the dimension-reduced low-frequency part, is used as the input sample to a fuzzy neural network, which is trained with the self-adaptive incremental learning algorithm proposed in this dissertation. Simulation results show that the fuzzy neural network trained with Haar preprocessing achieves higher accuracy than the same network without it.
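The grow-and-prune cycle of contribution 4 can be sketched as follows. This is a toy Gaussian-unit network, not the dissertation's exact SAIL-FNN: the thresholds, the plain LMS weight update (standing in for the extended Kalman filter step), and all names are illustrative assumptions.

```python
import numpy as np

class SignificanceNet:
    """Toy Gaussian-unit network illustrating significance-based growing
    and pruning; parameters and update rules are illustrative only."""

    def __init__(self, grow_thresh=0.3, prune_thresh=0.01, width=0.3, lr=0.1):
        self.centers, self.weights = [], []
        self.grow_thresh, self.prune_thresh = grow_thresh, prune_thresh
        self.width, self.lr = width, lr

    def _phi(self, x):
        # Gaussian activation of every unit for input x
        return np.array([np.exp(-np.sum((x - c) ** 2) / (2 * self.width ** 2))
                         for c in self.centers])

    def predict(self, x):
        return float(self._phi(x) @ np.asarray(self.weights)) if self.centers else 0.0

    def significance(self, i, recent):
        # average |contribution| of unit i to the output over recent inputs
        c, w = self.centers[i], self.weights[i]
        acts = [np.exp(-np.sum((x - c) ** 2) / (2 * self.width ** 2)) for x in recent]
        return abs(w) * float(np.mean(acts))

    def update(self, x, y, recent):
        err = y - self.predict(x)
        if abs(err) > self.grow_thresh:
            # grow: a new unit centered at x absorbs the current residual
            self.centers.append(np.asarray(x, dtype=float))
            self.weights.append(err)
        elif self.centers:
            # refine existing weights (LMS here; EKF in the dissertation)
            self.weights = list(np.asarray(self.weights) + self.lr * err * self._phi(x))
        # prune: drop units whose significance over recent data is negligible
        keep = [i for i in range(len(self.centers))
                if self.significance(i, recent) > self.prune_thresh]
        self.centers = [self.centers[i] for i in keep]
        self.weights = [self.weights[i] for i in keep]
```

Run on a one-dimensional regression stream, the network adds units only where the residual is large and discards units that stop contributing, which is the size/accuracy trade-off the abstract reports for SAIL-FNN.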
Keywords/Search Tags: incremental learning, singular value decomposition, fuzzy neural network, Fisher linear discriminant analysis, wavelet transform, face recognition