The extreme learning machine is a recent learning algorithm for training single-hidden-layer feedforward networks. Researchers have paid increasing attention to the extreme learning machine because of its simplicity, fast learning speed, and good performance. After the extreme learning machine has been used to train a single-hidden-layer feedforward network, some of the training data may need to be eliminated, such as dirty data and redundant data. However, retraining the whole network with the extreme learning machine consumes much time, especially for big data. This paper proposes an online negative incremental algorithm and a substitutive incremental algorithm. After training data are eliminated or replaced, the network need not be retrained and the final output weights need not be computed from scratch; they only need to be updated based on the original output weights. Furthermore, this paper also proposes an incremental hidden-layer-nodes algorithm based on the error-minimized extreme learning machine. After hidden layer nodes are added, the test outputs are updated based on the original outputs and need not be computed from scratch.

The contents are organized as follows.

In Chapter 1, we introduce the principle and research status of the extreme learning machine, summarize the research contents and significance of this paper, and compare the extreme learning machine with traditional neural network algorithms.

In Chapter 2, we study the online negative incremental algorithm. After data are eliminated, the final output weights are computed from the original output weights by an incremental learning method, and the test outputs are updated likewise. Furthermore, the computational complexity analysis and simulation experiments show that the online negative incremental algorithm is faster than the original extreme learning machine.

In Chapter 3, we study the substitutive incremental algorithm. After data are replaced, the final output weights are computed from the original output weights by combining the online negative incremental algorithm with the incremental learning method, and the test outputs are updated likewise. Besides, the computational complexity analysis and simulation experiments show that the substitutive incremental algorithm performs well.

In Chapter 4, we study the incremental hidden-layer-nodes algorithm. After hidden layer nodes are added, the output weights are computed based on the error-minimized extreme learning machine, and the test outputs are then updated. Moreover, the computational complexity analysis and simulation experiments show that the incremental hidden-layer-nodes algorithm performs well in terms of speed.

In Chapter 5, we summarize the three kinds of incremental learning algorithms based on the extreme learning machine.
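To make the incremental setting concrete, the standard extreme learning machine solution and one common way to realize a decremental (negative incremental) update can be sketched as follows. The notation (hidden-layer output matrix $H$, target matrix $T$, output weights $\beta$, eliminated block $H_d$, $T_d$) and the use of the Sherman-Morrison-Woodbury identity are generic assumptions for illustration, not necessarily the exact derivation of this paper.

\[
\beta = H^{\dagger}T = (H^{T}H)^{-1}H^{T}T, \qquad K = H^{T}H .
\]
When the rows $H_d$, $T_d$ corresponding to the eliminated samples are removed, a rank downdate gives
\[
(K - H_d^{T}H_d)^{-1} = K^{-1} + K^{-1}H_d^{T}\bigl(I - H_dK^{-1}H_d^{T}\bigr)^{-1}H_dK^{-1},
\]
\[
\beta_{\mathrm{new}} = (K - H_d^{T}H_d)^{-1}\bigl(H^{T}T - H_d^{T}T_d\bigr),
\]
so the updated output weights are obtained from quantities already available after the initial training, without retraining the whole network.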
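For Chapter 4, the starting point is the error-minimized extreme learning machine, whose block-pseudoinverse update for appending new hidden nodes can be sketched as follows; the symbols ($H_k$ for the current hidden-layer matrix, $\delta H_k$ for the columns contributed by the newly added nodes) are assumed here for illustration rather than taken from the thesis.

\[
H_{k+1} = [\,H_k \;\; \delta H_k\,],\qquad
D_{k+1} = \bigl((I - H_kH_k^{\dagger})\,\delta H_k\bigr)^{\dagger},\qquad
U_{k+1} = H_k^{\dagger}\bigl(I - \delta H_k D_{k+1}\bigr),
\]
\[
\beta_{k+1} = \begin{bmatrix} U_{k+1}\\ D_{k+1}\end{bmatrix}T ,
\]
so the enlarged output weights, and with them the test outputs, are updated from the previous solution rather than recomputed from the full pseudoinverse of $H_{k+1}$.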