
Research On Incremental Extreme Learning Machine Algorithms

Posted on: 2021-02-19    Degree: Master    Type: Thesis
Country: China    Candidate: P Y Zuo    Full Text: PDF
GTID: 2428330611473201    Subject: Software engineering
Abstract/Summary:
The hidden-layer parameters of the extreme learning machine (ELM), namely the input weights and biases, are chosen randomly, and the only parameter that must be set is the number of hidden-layer nodes. The output weights are obtained by the least-squares method, which avoids repeated iterations and local minima and yields good generalization performance with very high learning efficiency. ELM is widely used in medical biology, computer vision, and image processing. In recent years, much research has focused on finding the optimal number of hidden-layer nodes and on adding training samples to the model online. This paper studies the incremental extreme learning machine as follows.

The inverse-free extreme learning machine finds the optimal number of hidden-layer nodes by adding hidden nodes gradually. This paper extends it to an inverse-matrix-free online sequential version, the inverse-matrix-free online sequential extreme learning machine (IFOS-ELM). The algorithm first finds the optimal number of hidden-layer nodes using the Schur complement formula; then, as the dataset grows, it uses the Sherman-Morrison-Woodbury identity to avoid recomputing the output-weight matrix from scratch over all training samples. Detailed derivations of the proposed IFOS-ELM are given. Experimental results on datasets of different types and sizes show that IFOS-ELM is well suited to data that are generated gradually in an online way, offering both fast training and promising performance.

However, IFOS-ELM suffers from poor classification accuracy on class-imbalanced data. To address this, the paper further proposes an incremental online sequential extreme learning machine for class imbalance (IOS-ELM). The basic idea is to (1) push the separating hyperplane toward the majority class by adjusting a balance factor according to the class-imbalance ratio, and (2) determine the appropriate number of hidden nodes by adding them incrementally with the Schur complement or the Sherman-Morrison-Woodbury formula, thereby improving the online learning ability of IOS-ELM. Experimental results on fourteen binary-class and multi-class imbalanced datasets show that the proposed IOS-ELM has better generalization ability and classification performance than the compared methods.
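As a rough illustration of the machinery the abstract describes, the sketch below implements a plain OS-ELM-style update in Python: random hidden-layer weights, regularized least-squares output weights, and a Sherman-Morrison-Woodbury update for newly arriving samples. The class name, parameters, and regularization are illustrative assumptions only; the sketch does not reproduce the thesis's IFOS-ELM or IOS-ELM derivations (the Schur-complement node-growing step and the class-imbalance balance factor are omitted).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class OSELMSketch:
    """Minimal online-sequential ELM sketch (standard OS-ELM-style update).

    Hidden-layer weights and biases are random and fixed; output weights are
    fit by regularized least squares and updated for each new data chunk with
    the Sherman-Morrison-Woodbury identity, so the full hidden-layer Gram
    matrix is never re-inverted.
    """

    def __init__(self, n_inputs, n_hidden, reg=1e-3, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.standard_normal((n_inputs, n_hidden))  # random input weights
        self.b = rng.standard_normal(n_hidden)               # random biases
        self.reg = reg
        self.P = None      # running inverse of (H^T H + reg * I)
        self.beta = None   # output weights

    def _hidden(self, X):
        return sigmoid(X @ self.W + self.b)

    def fit_initial(self, X, T):
        # Batch least-squares solution on the initial data chunk.
        H = self._hidden(X)
        L = H.shape[1]
        self.P = np.linalg.inv(H.T @ H + self.reg * np.eye(L))
        self.beta = self.P @ H.T @ T
        return self

    def partial_fit(self, X, T):
        # Sherman-Morrison-Woodbury update: only a chunk-sized system is
        # solved per incoming batch instead of re-inverting the L x L matrix.
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta


# Illustrative usage on synthetic data (one-hot targets work the same way).
rng = np.random.default_rng(0)
X0, T0 = rng.standard_normal((200, 5)), rng.standard_normal((200, 1))
model = OSELMSketch(n_inputs=5, n_hidden=40, rng=0).fit_initial(X0, T0)
X1, T1 = rng.standard_normal((50, 5)), rng.standard_normal((50, 1))
model.partial_fit(X1, T1)  # new chunk absorbed without refitting from scratch
```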
Keywords/Search Tags: extreme learning machine, inverse-matrix-free, online sequential learning, class imbalance learning, incremental