
Research On Random Hierarchical Neural Network Based One-class Classification

Posted on: 2021-04-03
Degree: Master
Type: Thesis
Country: China
Candidate: H Z Dai
Full Text: PDF
GTID: 2428330605451225
Subject: Control Engineering
Abstract/Summary:
One-class classification algorithms aim to build a classification model from the target data alone: they learn the characteristics of the target class and use the learned model to detect abnormal samples. One-class methods perform well in many applications, such as anomaly detection and classification of imbalanced datasets. Representative one-class algorithms include the one-class support vector machine (OC-SVM), support vector data description (SVDD), naive Parzen density estimation, and the autoencoder. However, existing one-class algorithms are less effective on complex or high-dimensional large datasets that contain severe outliers, so improving their performance to make future practical applications possible is an urgent problem. Starting from the One-Class Extreme Learning Machine (OC-ELM), this thesis studies the OC-ELM and its improved variants in depth. The innovations focus on improving the generalization and representation ability of the one-class model on complex data and on handling severe out-of-group interference noise. The main work and results are as follows:

1. To address the shortcomings of shallow networks in representing complex or high-dimensional large datasets, a deep feature-learning model based on multi-layer stacked autoencoders is constructed. To speed up feature learning, each autoencoder is trained with the extreme learning machine, and the model is further extended to a multi-layer network (ML-OCELM). Experiments on 13 UCI datasets and an 11-class urban noise detection task show that, compared with the traditional OC-ELM, the proposed ML-OCELM one-class classifier effectively improves the anomaly detection performance of the model.
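The ML-OCELM idea above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the thesis's implementation: the layer sizes, the ridge parameter `C`, the `tanh` activation, and the quantile-based threshold are all assumptions chosen for a minimal, runnable example. Each layer is an ELM-style autoencoder (random hidden weights, closed-form output weights), and a final one-class layer regresses the stacked features to a constant target, thresholding the reconstruction error to flag anomalies.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_autoencoder_layer(X, n_hidden, C=1.0):
    """One ELM-based autoencoder layer: random input weights and biases,
    output weights solved in closed form by ridge regression (ELM training)."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # random feature map
    # beta reconstructs X from H: min ||H beta - X||^2 + ||beta||^2 / C
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ X)
    # use the data-adapted projection beta.T as the layer's encoder
    return np.tanh(X @ beta.T)

def ml_ocelm_fit(X, hidden_sizes=(64, 32), quantile=0.95, C=1.0):
    """Stack ELM autoencoder layers, then fit a one-class output layer that
    maps the encoded features to the constant target 1; points whose error
    exceeds a quantile-based threshold are treated as anomalies."""
    H = X
    for n in hidden_sizes:
        H = elm_autoencoder_layer(H, n, C)
    beta = np.linalg.solve(H.T @ H + np.eye(H.shape[1]) / C, H.T @ np.ones(len(X)))
    err = np.abs(H @ beta - 1.0)
    theta = np.quantile(err, quantile)  # decision threshold on training error
    return beta, theta, H

# Toy usage: Gaussian target data; roughly 5% of the training points
# fall above the 95%-quantile threshold by construction.
X = rng.standard_normal((200, 5))
beta, theta, H = ml_ocelm_fit(X)
scores = np.abs(H @ beta - 1.0)
print("fraction flagged as anomalous:", np.mean(scores > theta))
```

A real implementation would store each layer's weights so that new test points can be encoded and scored with the same stack; the sketch keeps only the training-set encodings for brevity.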
2. ML-OCELM, proposed in 1 above, is sensitive to the number of hidden-layer neurons and suffers from poor model optimization caused by the random projection. This thesis therefore further studies a kernel-based ML-OCELM (MK-OCELM), in which a kernel function is introduced to compute the inner product of the original data in a high-dimensional space.

3. To improve the robustness of traditional ELM algorithms to non-Gaussian noise, this thesis further studies single-layer and multi-layer one-class random neural networks based on the maximum correntropy criterion (MCC). The OC-ELM loss function is reconstructed with MCC to make it more robust, and the result is then extended to the multi-layer network structure.

4. To further improve the robustness of the MCC-based one-class classifiers in 3 above, this thesis studies MC-OCELM-VC and HC-OCELM-VC, which are based on maximum correntropy with a variable center (MCC-VC) of the Gaussian kernel function. Replacing the zero-mean Gaussian kernel of the original correntropy with a variable-center Gaussian kernel better matches the actual error distribution.

5. Unlike the method in 4 above, this part analyzes the robustness of MCC-based one-class classification from a different viewpoint and studies one-class classification algorithms based on maximum mixture correntropy (MMC-OCELM and HMC-OCELM). The original correntropy uses only a single Gaussian kernel function; in MMC-OCELM and HMC-OCELM, multiple kernel functions combined by an affine combination replace the single kernel, which improves the robustness of the algorithms.
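The robustness argument behind points 3 to 5 can be illustrated numerically. The sketch below is an assumption-laden toy (kernel bandwidths, centers, and mixture weights are invented for illustration, not taken from the thesis): it compares the squared-error loss with the Gaussian correntropy-induced loss on a set of residuals containing one severe outlier. The `center` parameter hints at the variable-center variant (MCC-VC), and the mixture function shows the affine combination of kernels used in the mixture-correntropy variants.

```python
import numpy as np

def correntropy_loss(e, sigma=1.0, center=0.0):
    """Correntropy-induced loss with a Gaussian kernel.
    center=0 gives the standard MCC loss; a nonzero (learned) center
    corresponds to the variable-center variant (MCC-VC)."""
    return 1.0 - np.exp(-((e - center) ** 2) / (2 * sigma ** 2))

def mixture_correntropy_loss(e, sigmas=(0.5, 2.0), weights=(0.5, 0.5)):
    """Mixture-correntropy loss: an affine combination of several
    Gaussian kernels replaces the single kernel."""
    k = sum(w * np.exp(-e ** 2 / (2 * s ** 2)) for w, s in zip(weights, sigmas))
    return 1.0 - k

# Residuals: mostly small errors plus one severe outlier.
e = np.array([0.1, -0.2, 0.05, 8.0])

mse = e ** 2
mcc = correntropy_loss(e)
mmc = mixture_correntropy_loss(e)

# The squared loss lets the outlier dominate the total cost, while the
# correntropy losses saturate near 1, bounding the outlier's influence.
print("outlier/inlier cost ratio, MSE:", mse[-1] / mse[:-1].sum())
print("outlier/inlier cost ratio, MCC:", mcc[-1] / mcc[:-1].sum())
```

Because the correntropy loss is bounded above by 1, no single corrupted sample can dominate the training objective, which is the property the MCC-based OC-ELM variants exploit against non-Gaussian and out-of-group noise.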
Keywords/Search Tags: One-class Classification, Anomaly Detection, Autoencoder, Kernel Learning, Maximum Correntropy Criterion, Extreme Learning Machine