
Research On Small-sample Image Classification Based On Neural Networks

Posted on: 2021-04-06
Degree: Master
Type: Thesis
Country: China
Candidate: Y P Qiu
Full Text: PDF
GTID: 2428330623983950
Subject: Internet of Things Engineering
Abstract/Summary:
Small-sample image classification based on deep learning is a key problem in computer vision. In practical applications, the difficulty of data collection and the high cost of labeling lead to insufficient training samples. Moreover, deep neural networks contain a large number of parameters; when the training set is small, the complex decision surface in the hypothesis space causes the model to reject samples that are only slightly biased, producing overfitting and degrading classification accuracy. To address these problems, this work studies how to learn sufficient knowledge from few samples and build an accurate model. It focuses on neural-network-based image classification methods for small samples from three aspects: transfer learning from large datasets to small datasets, loss functions in the neural network, and model ensembling. The main work is as follows:

(1) This thesis studies how to learn from existing big-data tasks and transfer the resulting model to the small-sample classification task. Insufficient data in a small-sample dataset leads to inadequately trained models and poor classification performance. Based on the idea of transfer learning, this thesis studies how to fine-tune a model pre-trained on a large dataset and use it to extract features from the small-sample data, making full use of the attributes and generalized representations the model has already learned (a fine-tuning sketch is given after this abstract).

(2) Confusion between similar categories makes it difficult for the model to extract discriminative features. To address this, this thesis proposes a loss function for neural networks that introduces a confusion-rate-weighted soft label as a measure of similarity between categories. The proposed loss function enables the model to dynamically attend to all samples, especially those that are easily misclassified during training, forcing the model to learn distinctive features, reducing confusion between categories, and enlarging the variance between classes (an illustrative implementation follows this abstract). Experimental results on the LabelMe and Caltech101 datasets show that the dynamic attention loss function generalizes better and achieves higher classification accuracy than the alternatives.

(3) To address the tendency of neural networks to overfit when the dataset is small, this thesis proposes a new ensemble method named parallel snapshot ensemble, which provides a new option for recombining the base models of a snapshot ensemble. Instead of using all base models as the snapshot method does, this method selects the better base models by Relative Mean Kullback-Leibler Divergence, a new divergence measure of diversity, and reorganizes them (a selection sketch is given below). Experimental results on four commonly used small-sample image classification datasets show that the parallel snapshot ensemble is effective in alleviating overfitting.
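The following is a minimal sketch of the fine-tuning pipeline described in part (1). The abstract does not name a backbone or framework, so PyTorch and an ImageNet-pretrained ResNet-18 are assumed here purely for illustration: the pretrained convolutional layers serve as a frozen feature extractor while a new classification head is fine-tuned on the small-sample data.

```python
# Transfer-learning sketch (assumed stack: PyTorch + torchvision; the thesis
# does not specify the backbone, so ResNet-18 is an illustrative choice).
import torch
import torch.nn as nn
from torchvision import models

num_classes = 101  # e.g. Caltech101

# Load a model pre-trained on a large dataset (ImageNet).
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Freeze the convolutional layers so they act as a generic feature extractor.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head and fine-tune only its parameters
# on the small-sample dataset.
model.fc = nn.Linear(model.fc.in_features, num_classes)

optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()
```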
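For part (2), the abstract does not give the exact formula of the confusion-rate-weighted soft-label loss, so the sketch below encodes one plausible reading: soft targets whose off-diagonal mass follows a running class-confusion estimate, so that class pairs confused more often during training receive proportionally more of the smoothing mass. The class, buffer name, and hyperparameters are all hypothetical.

```python
# Hedged sketch of a confusion-rate-weighted soft-label loss (one possible
# interpretation of the "dynamic attention loss"; not the thesis's exact form).
import torch
import torch.nn.functional as F

class ConfusionSoftLabelLoss(torch.nn.Module):
    def __init__(self, num_classes, smoothing=0.1, momentum=0.9):
        super().__init__()
        self.smoothing = smoothing
        self.momentum = momentum
        # Running estimate of class-confusion rates (row c: average predicted
        # distribution for samples whose true class is c).
        self.register_buffer(
            "confusion",
            torch.full((num_classes, num_classes), 1.0 / num_classes))

    @torch.no_grad()
    def update_confusion(self, logits, targets):
        probs = F.softmax(logits, dim=1)
        for c in targets.unique():
            row = probs[targets == c].mean(dim=0)
            self.confusion[c] = (self.momentum * self.confusion[c]
                                 + (1 - self.momentum) * row)

    def forward(self, logits, targets):
        self.update_confusion(logits, targets)
        # Soft label: (1 - s) on the true class, s spread over the other
        # classes in proportion to their confusion rate with the true class.
        conf = self.confusion[targets].clone()
        conf.scatter_(1, targets.unsqueeze(1), 0.0)
        conf = conf / conf.sum(dim=1, keepdim=True).clamp_min(1e-8)
        soft = self.smoothing * conf
        soft.scatter_(1, targets.unsqueeze(1), 1.0 - self.smoothing)
        log_probs = F.log_softmax(logits, dim=1)
        return -(soft * log_probs).sum(dim=1).mean()
```

Classes that are frequently confused with the true class receive a larger share of the soft-label mass, which is one way to make the network focus on easily misclassified samples, as the abstract describes.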
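For part (3), neither Relative Mean Kullback-Leibler Divergence nor the selection rule is defined in the abstract. The sketch below is therefore an assumption: each snapshot is scored by the mean KL divergence of its validation predictions from the ensemble-mean prediction, normalized by the average score across snapshots, and only the more diverse snapshots (relative score at or above 1) are kept and averaged.

```python
# Hedged sketch of snapshot selection for a "parallel snapshot ensemble".
# The scoring and selection rule here are illustrative assumptions, not the
# thesis's exact definitions.
import torch

def relative_mean_kl_select(snapshot_probs):
    """snapshot_probs: (M, N, C) softmax outputs of M snapshots on N
    validation samples with C classes. Returns indices of kept snapshots."""
    mean_probs = snapshot_probs.mean(dim=0, keepdim=True)  # (1, N, C)
    # Mean KL(snapshot || ensemble mean) over the validation set, per snapshot.
    kl = (snapshot_probs *
          (snapshot_probs.clamp_min(1e-8).log()
           - mean_probs.clamp_min(1e-8).log())).sum(dim=2).mean(dim=1)  # (M,)
    relative = kl / kl.mean().clamp_min(1e-8)
    keep = (relative >= 1.0).nonzero(as_tuple=True)[0]
    # Fall back to all snapshots if the rule filters everything out.
    return keep if keep.numel() > 0 else torch.arange(snapshot_probs.shape[0])

def ensemble_predict(snapshot_probs, keep):
    # Average the predictions of the selected base models only, rather than
    # all snapshots as the original snapshot-ensemble method does.
    return snapshot_probs[keep].mean(dim=0)  # (N, C)
```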
Keywords/Search Tags:Small-sample image classification, Neural network, Transfer learning, Loss function, Ensemble learning