Using Self-organizing Incremental Neural Network In Unsupervised Feature Learning Of Single-Layer Network

Posted on: 2016-02-05    Degree: Master    Type: Thesis
Country: China    Candidate: J Lu    Full Text: PDF
GTID: 2308330461956534    Subject: Computer application technology
Abstract/Summary:
Much recent work in machine learning has focused on learning good feature representations from unlabeled input data for higher-level tasks such as non-linear classification and image recognition. Current solutions typically learn multi-level representations by greedily "pre-training" several layers of features, one layer at a time, with an unsupervised learning algorithm such as the sparse auto-encoder, the restricted Boltzmann machine, or k-means. These methods work well because of their representational power, but they share some drawbacks: slow training or painful parameter tuning. In this paper we use the Self-Organizing Incremental Neural Network (SOINN) for unsupervised feature learning in a single-layer network. The two products of this work are a radial basis function neural network using SOINN (SOINN-RBF) and a convolutional neural network using SOINN (SOINN-CNN).

In SOINN-RBF, we first learn representative points from unlabeled data with SOINN. We then compute the Gaussian kernel distance from each input sample to the representative points as its feature vector, and finally train the linear regression parameters with the least squares method. In the experiments, SOINN-RBF achieves results comparable to SVM on two real-life data sets, Thyroid and Insurance. We also combine the unsupervised learning procedure of the network with the supervised classification procedure, which yields an online SOINN-RBF that is much faster than other online RBF algorithms.

In SOINN-CNN, we extract image patches of a fixed size as the unlabeled training data. SOINN is used in the unsupervised feature learning procedure, which produces the convolution kernels. The original images are then convolved with these kernels, and feature vectors are generated by activating and pooling the convolved maps. Finally, a linear classifier is trained with the SVM algorithm. In the experiments, we analyse the positive effect of the normalization and whitening procedures on unsupervised feature learning, and achieve an accuracy of 78.09% on the CIFAR-10 data set. This result shows that SOINN-CNN is equivalent to, if not better than, other unsupervised feature learning algorithms.

To conclude, the algorithm proposed in this paper is a single-layer unsupervised feature learning algorithm. The first step learns representative knowledge from unlabeled data with the Self-Organizing Incremental Neural Network; the second step extracts feature vectors using that knowledge; in the last step we train a classifier on the feature vectors. The algorithm trains faster and is easier to tune than other unsupervised feature learning algorithms, and, most importantly, it can learn new knowledge incrementally from new data.
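The following is a minimal sketch of the SOINN-RBF pipeline described above, assuming the representative points have already been learned; SOINN itself is not implemented here, and k-means centroids stand in for the SOINN nodes. All function names and parameters are illustrative, not taken from the thesis code.

```python
import numpy as np
from sklearn.cluster import KMeans

def learn_prototypes(X_unlabeled, n_nodes=50, seed=0):
    """Stand-in for the unsupervised SOINN step: return representative points."""
    km = KMeans(n_clusters=n_nodes, n_init=10, random_state=seed).fit(X_unlabeled)
    return km.cluster_centers_

def gaussian_features(X, prototypes, sigma=1.0):
    """Gaussian kernel distance from each sample to each representative point."""
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_least_squares(Phi, y, n_classes, reg=1e-3):
    """Regularised least squares on one-hot targets (the linear read-out)."""
    Y = np.eye(n_classes)[y]                          # one-hot labels
    A = Phi.T @ Phi + reg * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ Y)              # weight matrix W

def predict(X, prototypes, W, sigma=1.0):
    return np.argmax(gaussian_features(X, prototypes, sigma) @ W, axis=1)
```

The SOINN-CNN feature-extraction step can be sketched in the same spirit, again assuming the convolution kernels have already been learned from image patches (the SOINN step is stubbed out) and using an assumed ReLU activation and coarse sum-pooling grid; the thesis does not specify these details in the abstract.

```python
import numpy as np

def extract_features(image, kernels, pool_grid=2):
    """Convolve one grayscale image with the learned kernels, apply a
    ReLU-style activation, then sum-pool over a pool_grid x pool_grid grid."""
    h, w = image.shape
    k = kernels.shape[1]                      # kernels: (n_kernels, k, k)
    out_h, out_w = h - k + 1, w - k + 1
    maps = np.zeros((len(kernels), out_h, out_w))
    flat = kernels.reshape(len(kernels), -1)
    for i in range(out_h):                    # naive "valid" convolution
        for j in range(out_w):
            maps[:, i, j] = flat @ image[i:i + k, j:j + k].ravel()
    maps = np.maximum(maps, 0.0)              # activation
    features = []                             # pool each map over a coarse grid
    for m in maps:
        for rows in np.array_split(m, pool_grid, axis=0):
            for block in np.array_split(rows, pool_grid, axis=1):
                features.append(block.sum())
    return np.array(features)                 # length: n_kernels * pool_grid**2
```

In both sketches, the resulting feature vectors would then be fed to a linear classifier (least squares above, or a linear SVM for the CNN variant) as described in the abstract.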
Keywords/Search Tags: unsupervised feature learning, self-organizing incremental neural network, radial basis function neural network, convolutional neural network