
Research On Classification Algorithm Of Semi-supervised Extreme Learning Machine Based On Sparse Bayesian

Posted on: 2018-01-20
Degree: Master
Type: Thesis
Country: China
Candidate: D H Zhao
Full Text: PDF
GTID: 2348330518997702
Subject: Computer system architecture

Abstract/Summary:
Mining knowledge from enormous amounts of raw information to enable quick extraction of target information has become a popular topic in the machine learning community in recent years. Semi-supervised learning has attracted great attention because it combines the information from a few labeled samples with a large amount of unlabeled data to improve classification performance. However, the computational complexity of traditional semi-supervised classification algorithms, such as the Laplacian Support Vector Machine (O(N^3), where N is the number of input samples), is usually prohibitive for large datasets. The Extreme Learning Machine (ELM) is a single-hidden-layer feedforward neural network with an extremely low computational cost. To reduce the computational cost of semi-supervised learning, the Semi-supervised Extreme Learning Machine (SSELM), built on the ELM framework, has been proposed. SSELM combines the advantages of both ELMs and semi-supervised learning: it inherits the learning efficiency of ELMs and can make use of both labeled and unlabeled data. However, the accuracy of SSELM is sensitive to the number of hidden neurons, so SSELM often needs a large and complex model to approximate the data, and it does not perform well in terms of sparseness and classification accuracy. In this thesis, a semi-supervised learning algorithm based on sparse Bayesian inference and SSELM is proposed to solve these problems, called the Sparse Bayesian Semi-supervised Extreme Learning Machine (SBSSELM). It makes full use of unlabeled data and improves classification accuracy by introducing a manifold prior on the weights of the output layer of the ELM. In addition, by automatically pruning most of the redundant hidden neurons during the learning procedure, it achieves a compact model that is relatively insensitive to the number of hidden neurons.
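The SSELM training step described above (a random hidden layer followed by a graph-Laplacian-regularized least-squares solve for the output weights) can be sketched in Python as follows. This is a minimal illustration, not the thesis's implementation: the function names, the dense RBF affinity graph, and the regularization parameters `lam` and `C` are assumptions made for the example.

```python
import numpy as np

def rbf_laplacian(X, sigma=1.0):
    """Graph Laplacian L = D - W from a dense RBF affinity matrix (O(N^2) sketch)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(1)) - W

def sselm_fit(X, y_labeled, n_labeled, L_hidden=50, lam=0.01, C=1.0, seed=0):
    """SSELM sketch: random hidden-layer weights, then a regularized
    least-squares solve for the output weights with a manifold (Laplacian) term.
    Labeled samples are assumed to occupy the first n_labeled rows of X."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], L_hidden))
    b = rng.standard_normal(L_hidden)
    H = np.tanh(X @ W + b)                 # N x L hidden-layer output matrix
    L = rbf_laplacian(X)                   # Laplacian over labeled + unlabeled data
    c = np.zeros(X.shape[0])               # per-sample loss weight:
    c[:n_labeled] = C                      # C on labeled rows, 0 on unlabeled rows
    Y = np.zeros((X.shape[0], y_labeled.shape[1]))
    Y[:n_labeled] = y_labeled
    # (I + H' diag(c) H + lam * H' L H) beta = H' diag(c) Y
    A = np.eye(L_hidden) + H.T @ (c[:, None] * H) + lam * H.T @ L @ H
    beta = np.linalg.solve(A, H.T @ (c[:, None] * Y))
    return W, b, beta

def sselm_predict(X, W, b, beta):
    """Network output for new samples; argmax over columns gives the class."""
    return np.tanh(X @ W + b) @ beta
```

The manifold term `lam * H.T @ L @ H` is what lets the unlabeled rows influence the output weights: it penalizes outputs that vary sharply between samples the affinity graph considers close.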
The experiments on benchmark data sets demonstrate that SBSSELM yields a sparse model with classification accuracy comparable to some state-of-the-art semi-supervised classifiers, while maintaining more sparseness and stability than SSELM. The major work and innovations of this thesis are summarized as follows:

(1) SSELM-based classification algorithms do not perform well in terms of sparseness and stability. We propose to employ a Sparse Bayesian Learning (SBL) approach to learn the output weights of the SSELM classifier, enhancing its sparseness and stability. Hence, SBSSELM has the advantages of both SBL (high sparsity) and SSELM (universal approximation and learning efficiency).

(2) The output weights of traditional SSELM are solved by least squares, which easily suffers from overfitting. By maximizing the marginal likelihood instead, SBSSELM shows less tendency to overfit noise in the training data.

(3) SBSSELM has a lower computational cost than SSELM. The time complexity of SBSSELM is O(L^3 + N log N), where L is the number of hidden neurons and N is the number of input samples: N log N is the cost of constructing the Laplacian matrix, and L^3 is the cost of calculating the output weights. Compared with the O(N^3) complexity of SSELM, SBSSELM therefore has a lower training time.

(4) The experiments on UCI data sets demonstrate that SBSSELM yields a sparse model with classification accuracy comparable to some state-of-the-art semi-supervised methods. Moreover, we carry out additional experiments on a real-world email classification data set, where the proposed SBSSELM model also achieves high accuracy and sparseness.
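The sparse Bayesian pruning in point (1) can be illustrated with a generic ARD-style (automatic relevance determination) update over the hidden-layer output matrix H: each weight gets a precision hyperparameter alpha_i, the hyperparameters are re-estimated from the marginal likelihood, and neurons whose alpha_i diverges are pruned. This is a hedged sketch of standard SBL re-estimation, not the thesis's exact update schedule, and the function name, `sigma2`, and `alpha_max` threshold are assumptions for the example.

```python
import numpy as np

def sbl_prune(H, y, sigma2=0.01, n_iter=50, alpha_max=1e6):
    """ARD-style sparse Bayesian weight estimation over design matrix H (N x L).
    Columns whose precision alpha_i exceeds alpha_max are pruned; the survivors
    are the retained hidden neurons."""
    N, L = H.shape
    keep = np.arange(L)          # indices of still-active columns (neurons)
    alpha = np.ones(L)           # per-weight precision hyperparameters
    mu_full = np.zeros(L)
    for _ in range(n_iter):
        Hk = H[:, keep]
        # Posterior over the active weights: Sigma, mu (O(L^3) per iteration)
        Sigma = np.linalg.inv(np.diag(alpha[keep]) + Hk.T @ Hk / sigma2)
        mu_k = Sigma @ Hk.T @ y / sigma2
        # Marginal-likelihood re-estimation: alpha_i = gamma_i / mu_i^2
        gamma = 1.0 - alpha[keep] * np.diag(Sigma)
        alpha[keep] = np.maximum(gamma, 1e-10) / (mu_k ** 2 + 1e-12)
        mu_full = np.zeros(L)
        mu_full[keep] = mu_k
        keep = keep[alpha[keep] < alpha_max]   # prune diverged weights
    mu = np.zeros(L)
    mu[keep] = mu_full[keep]
    return mu, keep
```

Because the solve is over the L (surviving) columns rather than the N samples, each iteration costs O(L^3), which is the source of the O(L^3 + N log N) total complexity claimed in point (3) once the Laplacian construction is accounted for.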
Keywords/Search Tags:Semi-supervised Learning, Sparse Bayesian Learning, Classification, Extreme Learning Machine, Manifold Regularization