
Research On Image Classification Of Small-scale Dataset

Posted on: 2020-08-17
Degree: Master
Type: Thesis
Country: China
Candidate: T Z Zhang
Full Text: PDF
GTID: 2428330623467020
Subject: Software engineering
Abstract/Summary:
State-of-the-art performance on large-scale image datasets such as CIFAR-10, CIFAR-100, and ImageNet is uniformly achieved by deep convolutional neural networks (DCNNs). However, DCNNs are driven by large numbers of labeled training samples. On small-scale image datasets in certain fields, training DCNN models is difficult and the results are poor because labeled samples are scarce. Moreover, misclassification occurs easily on small-scale image datasets that require fine-grained classification, because the samples in these datasets are highly similar to each other.

To address the first problem, this paper applied a capsule network to the Caltech-256 and Oxford Flower-102 datasets, where it performed better than a CNN on small-scale image data. A kernel FCM (fuzzy C-means) clustering algorithm with optimized regularization parameters was incorporated into the capsule network as a replacement for the original clustering step, which streamlined the dynamic routing process. Comparative experiments showed that the capsule network improved accuracy by about 4% on Caltech-256 and by about 2% on Oxford Flower-102 (with background interference removed) compared with the baseline CNN model, and that the capsule network model of this paper improved on the original capsule network by about 0.2%. These experiments demonstrated the advantages of the approach over the DCNN model on the two small-scale image datasets and indicated its effectiveness.

To address the second problem, a two-level hierarchical feature fusion learning method based on transfer learning was proposed. First, a DCNN model pre-trained on a large-scale image dataset with sufficient samples was applied to the Caltech-256 and Oxford Flower-102 datasets, and common feature representations between the source domain and the target domain were learned by modifying the final objective function to fine-tune the network parameters. The steps above completed the extraction of the first-level general features. Since most classification errors on small-scale image datasets occur among highly similar image categories, this paper divided the high-similarity samples into special training subsets using spectral clustering, following the idea of fine-grained classification, and extracted the second-level special features by fine-tuning the network parameters on the trained model. Finally, the first-level general features and the second-level special features were merged and sent to the classifier to complete the final training process. Comparative experiments showed that the model using this two-level hierarchical feature fusion learning method improved accuracy by 0.73% to 0.86% over the model without it, which demonstrated that the method can effectively tackle the misclassification problem: it alleviated the misclassification that tends to occur on small-scale image datasets requiring fine classification and improved the accuracy of image classification.
Keywords/Search Tags: Image classification, Small-scale dataset, Capsule network, Transfer learning, Two-level feature fusion