
Classifier Design Based on a Fast Adaptive Algorithm

Posted on: 2019-04-09 | Degree: Master | Type: Thesis
Country: China | Candidate: L Qin | Full Text: PDF
GTID: 2428330599963849 | Subject: Control Science and Engineering
Abstract/Summary:
With the advent of the information age and the era of big data, classification problems have become increasingly complex. High-dimensional, large-scale samples and the requirements of online applications force researchers to choose methods that strike a compromise between classifier accuracy and the "dimension disaster" (the curse of dimensionality). To reduce the storage requirements of classifiers for online applications, and to give classifiers the ability to process high-dimensional samples, this thesis unifies the two traditionally independent processes of feature selection and classification in one framework.

The thesis first establishes a piecewise linear classifier from the perspective of the classification boundary. Based on the theory of the Support Vector Machine (SVM), this classifier uses the Adaptive Hinging Hyperplanes (AHH) model to partition the sample region so that the samples are linearly separable in each subregion. Thanks to a special training algorithm, the classifier combines feature selection and classification, successfully reaching a compromise between accuracy and the "dimension disaster".

Then, from the perspective of neural networks, the thesis establishes an adaptive neural network classifier. This classifier uses the basis function of the AHH model as the activation function of the network and builds a single-hidden-layer network with high model efficiency. Compared with other neural network classifiers, it is faster in both training and testing, making it better suited to online applications.

Finally, to train the parameters and coefficients of the above two classifiers, the thesis designs a fast adaptive training algorithm. The algorithm consists of two parts, a forward branching process and a backward pruning process, both of which adopt a stepwise training method. With this stepwise method, the algorithm can find an optimal or
approximately optimal solution for the model. In addition, to accelerate training, the thesis adds a heuristic search operator at each step of the branching process, which reduces the number of searches the algorithm performs. The algorithm is applied to both classifiers and makes them adaptive. Owing to this adaptiveness, the first classifier finds a compromise between accuracy and the number of features, and the second enjoys a more efficient network structure than other network classifiers.

Compared with other classifiers, the major advantage of the two classifiers built in this thesis is their adaptiveness. Not only can they implement feature selection, they can also perform classification tasks efficiently. Their efficient models ensure that the test speeds of the two classifiers are comparable to those of other classifiers.
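The AHH-based decision function described in the abstract can be illustrated with a minimal sketch. This assumes a common min-of-hinges form for the AHH basis (the minimum over several hinge terms max(0, w·x + b)); the thesis's exact basis definition and training procedure may differ, and all names and shapes here are illustrative.

```python
import numpy as np

def ahh_basis(x, weights, biases):
    """One AHH-style basis function: the minimum over several hinge
    terms max(0, w_k . x + b_k).  Illustrative form only."""
    hinges = np.maximum(0.0, weights @ x + biases)  # shape (n_hinges,)
    return hinges.min()

def ahh_classify(x, basis_params, coeffs, intercept=0.0):
    """Piecewise linear decision function: a weighted sum of AHH
    bases; the sign of the score gives the predicted class."""
    score = intercept + sum(
        c * ahh_basis(x, W, b) for c, (W, b) in zip(coeffs, basis_params)
    )
    return 1 if score >= 0 else -1
```

Because each basis is a min of hinges of linear functions, the resulting score is piecewise linear in x, so the decision boundary is a union of linear pieces, which matches the "linearly separable in each subregion" property the abstract describes.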
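The two-phase stepwise scheme (forward branching, then backward pruning) can be sketched generically. The candidate pool, loss function, and stopping rules below are illustrative assumptions, not the thesis's exact algorithm, which additionally uses a heuristic search operator to cut down the forward search.

```python
def stepwise_train(candidates, loss, max_terms):
    """Generic forward-branching / backward-pruning loop.

    candidates: iterable of terms the model may include
    loss:       function mapping a list of terms to a scalar loss
    max_terms:  cap on the number of terms added in the forward phase
    """
    model = []
    pool = list(candidates)

    # Forward branching: greedily add the candidate that most reduces
    # the loss, stopping when no candidate strictly improves the fit.
    while pool and len(model) < max_terms:
        best = min(pool, key=lambda c: loss(model + [c]))
        if loss(model + [best]) >= loss(model):
            break
        model.append(best)
        pool.remove(best)

    # Backward pruning: repeatedly drop any term whose removal does
    # not increase the loss, yielding a more compact model.
    improved = True
    while improved and model:
        improved = False
        for i in range(len(model)):
            reduced = model[:i] + model[i + 1:]
            if loss(reduced) <= loss(model):
                model = reduced
                improved = True
                break
    return model
```

In the thesis's setting the "terms" would be AHH basis functions (or, for the first classifier, candidate features), and the stepwise structure is what lets the trained classifiers adapt their own complexity.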
Keywords/Search Tags:Classifier, Feature Selection, Neural Network, Piecewise Linear, Adaptiveness