Research On Node Selection And Output Weight Calculation For Extreme Learning Machine

Posted on: 2020-06-20
Degree: Master
Type: Thesis
Country: China
Candidate: Z H Lao
Full Text: PDF
GTID: 2428330590960954
Subject: Electronic and communication engineering

Abstract/Summary:
As a special type of single-hidden-layer feedforward network (SLFN), the Extreme Learning Machine (ELM) has become an increasingly significant research topic due to its unique characteristics, namely fast training and good generalization. Unlike backpropagation (BP)-based SLFNs, ELM's parameters do not need to be iteratively tuned: the input parameters of the hidden layer are randomly generated, and the output weights are calculated analytically. However, because the input parameters are randomly generated, some of the hidden nodes in ELM may play only a minor role in the network output, which can increase network complexity and degrade performance. Another important factor affecting ELM's performance is the way the output weights are calculated. This thesis summarizes the basic problems and improvement directions of ELM at the current stage, and carries out research on hidden-node selection and on optimizing the output-weight calculation. The main research work is as follows:

(1) Multiresponse sparse regression based incremental regularized extreme learning machine. Incremental ELM is an important ELM algorithm that uses incremental learning to avoid the problem of choosing the number of hidden nodes in advance. Considering that there may be redundancy among candidate hidden nodes, multiresponse sparse regression (MRSR) is introduced into the candidate hidden-node selection of incremental ELM. In addition, to improve robustness and flexibility, an ℓ2 constraint is added to the loss function of incremental ELM, and an iterative update formula for the output weights is derived from the inversion of 2×2 block matrices. Experiments show that these measures are effective, and that the multiresponse sparse regression based incremental regularized extreme learning machine obtains good experimental results.

(2) Enhanced random search based hierarchical extreme learning machine for representation learning. Improving the performance of ELM-based multi-layer neural networks is the key to extending ELM's application range to complex problems such as image classification and speech recognition. This thesis examines the framework of Hierarchical ELM and proposes a new data-preprocessing step that makes the network structure more flexible. At the same time, incremental learning is introduced into the extreme learning machine auto-encoder (ELM-AE): hidden-node selection is added, and the output-weight calculation is optimized, to improve the effectiveness of ELM-AE's hidden nodes. In addition, a supervised convolution-kernel selection step is added to the Hierarchical ELM training process. Experiments show that the proposed ELM-AE has better approximation ability, and that the enhanced random search based hierarchical extreme learning machine performs well on image classification problems.
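For reference, the basic ELM training scheme described above (randomly generated hidden-layer parameters, analytically computed output weights) can be sketched as follows. This is a minimal illustration, not code from the thesis: the function names, the tanh activation, the Gaussian initialization, and the ridge term `lam` are all illustrative choices.

```python
import numpy as np

def elm_train(X, T, n_hidden, lam=1e-3, rng=None):
    """Train a basic regularized ELM: random hidden layer, analytic output weights."""
    rng = np.random.default_rng(rng)
    # Input weights and biases are randomly generated and never tuned.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # Output weights from ridge-regularized least squares:
    # beta = (H'H + lam*I)^{-1} H'T
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

The closed-form solve is what makes ELM training fast compared with BP-based SLFNs: no iterative tuning of the hidden layer is required.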
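The abstract mentions an iterative output-weight update based on inverting 2×2 block matrices when hidden nodes are added incrementally. A standard way to realize such an update is via the Schur complement: when one column is appended to the hidden-layer matrix H, the new Gram-matrix inverse is assembled from the old one in blocks, avoiding a full re-inversion. The sketch below shows this generic rank-one-extension idea under an ℓ2 (ridge) term; it is an assumed reconstruction, not the thesis's exact formula.

```python
import numpy as np

def add_node_update(H, K_inv, T, v, lam=1e-3):
    """Update the inverse of K = H'H + lam*I and the output weights
    when a new hidden-node column v is appended to H.

    The new Gram matrix has the 2x2 block form [[K, u], [u', c]],
    whose inverse follows from the Schur complement s = c - u'K^{-1}u.
    """
    v = v.reshape(-1, 1)
    u = H.T @ v                       # cross terms with existing nodes
    c = float(v.T @ v) + lam          # new diagonal entry
    Ku = K_inv @ u
    s = c - float(u.T @ Ku)           # Schur complement (scalar)
    top_left = K_inv + (Ku @ Ku.T) / s
    K_inv_new = np.block([[top_left, -Ku / s],
                          [-Ku.T / s, np.array([[1.0 / s]])]])
    H_new = np.hstack([H, v])
    beta_new = K_inv_new @ (H_new.T @ T)  # updated output weights
    return H_new, K_inv_new, beta_new
```

Each node addition then costs O(n²) instead of the O(n³) of re-solving from scratch, which is the usual motivation for block-wise updates in incremental ELM variants.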
Keywords/Search Tags: Extreme Learning Machine, Multi-layer Neural Network, Regularization, Incremental Learning