
Research On Softmax Complementary Class Weight Design Algorithm For Long-tailed Image Classification

Posted on: 2024-08-01
Degree: Master
Type: Thesis
Country: China
Candidate: L Y Hu
Full Text: PDF
GTID: 2568307067473684
Subject: Communication Engineering (including broadband network, mobile communication, etc.) (Professional Degree)
Abstract/Summary:
With the rapid development of deep learning, image classification techniques based on balanced datasets have made great progress. However, large-scale image datasets in the real world usually exhibit a long-tailed distribution: a few categories account for a large number of training samples, while most categories contain relatively few. Since traditional image classification methods rely heavily on balanced datasets for model training, they perform poorly on long-tailed datasets, and how to solve the long-tailed classification problem effectively has therefore attracted the attention of many researchers. Motivated by this, this thesis proposes two methods for long-tailed image classification by analyzing the sample information of the ground-truth class and its complement classes (all classes except the ground-truth class) in the long-tailed dataset, as follows:

(1) First, the thesis proposes a weighted complement Softmax method based on the optimization of sample gradients. This work presents a balanced complement loss (BACL) that balances the gradient ratio between the complement classes and the ground-truth class. Specifically, an adaptive weight is introduced for the complement classes in the Softmax loss to mitigate the overwhelming gradient suppression that the complement classes exert on the tail classes. A joint training loss is then constructed by combining this method with normalized complement entropy (NCE) through a novel double-angle sine decay strategy, which adjusts the relative contributions of the BACL and NCE losses across the different training stages. With the joint training loss, the model first learns useful sample information from the complement classes and then gradually shifts its attention to the classification task. Experiments on the CIFAR-10-LT, CIFAR-100-LT, SVHN-LT and ImageNet-LT datasets demonstrate the significant effect of the proposed method.

(2) Second, the thesis proposes a weighted margin Softmax method based on the prior distribution of samples. The method comprises an adaptive margin loss (AML) and a complement sample re-weighting (CRW) strategy. The adaptive margin loss adds different margins for the ground-truth class and the complement classes, encouraging the model to construct a more reasonable feature distribution during training, thereby reducing inter-class competition and improving the model's ability to learn discriminative features. The complement sample re-weighting strategy assigns larger weight coefficients to the tail classes based on the sample relationships of the complement classes, balancing the loss across classes to alleviate the long-tailed problem. Experiments on the CIFAR-10-LT, CIFAR-100-LT, SVHN-LT, MNIST-LT and Tiny-ImageNet-LT datasets prove the effectiveness of the proposed method.

In conclusion, this thesis proposes two weighted Softmax methods, based on sample-gradient optimization and the sample prior distribution respectively, to address the long-tailed image classification problem. Extensive experimental results demonstrate that the proposed methods significantly improve the classification performance of long-tailed classification models, and thus have good academic and application value.
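
The abstract does not give the exact form of BACL. The PyTorch sketch below only illustrates the general idea of point (1): every complement-class term in the Softmax denominator is rescaled by an adaptive per-class weight while the ground-truth term keeps weight 1; the function name and the choice of weights are assumptions for illustration, not the thesis' published formula.

    import torch
    import torch.nn.functional as F

    def weighted_complement_softmax_loss(logits, targets, complement_weights):
        """Softmax cross-entropy whose denominator re-weights complement classes.

        logits:             (batch, num_classes) raw scores.
        targets:            (batch,) ground-truth class indices.
        complement_weights: (num_classes,) positive weights applied to the
                            complement-class terms of the Softmax denominator;
                            the ground-truth term always keeps weight 1.
        """
        num_classes = logits.size(1)
        one_hot = F.one_hot(targets, num_classes).bool()
        # Keep the ground-truth term unweighted, rescale every complement term.
        w = complement_weights.unsqueeze(0).expand_as(logits)
        w = torch.where(one_hot, torch.ones_like(w), w)
        # log sum_j w_j * exp(z_j), computed stably via logsumexp.
        log_denominator = torch.logsumexp(logits + torch.log(w.clamp_min(1e-12)), dim=1)
        z_y = logits.gather(1, targets.unsqueeze(1)).squeeze(1)
        return (log_denominator - z_y).mean()

One hypothetical choice is complement_weights proportional to per-class sample counts (e.g. counts / counts.max()), so that tail classes, which appear very often as complement classes of head-class samples, contribute smaller suppressing terms and hence smaller negative gradients.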
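The abstract only names the "double-angle sine decay" used to blend BACL with NCE; its exact formula is not given. The following sketch shows the joint-loss structure it describes, with a simple squared-sine ramp standing in for the thesis' schedule: early in training the NCE term dominates (complement-class learning), and the weight gradually shifts to the BACL classification term.

    import math

    def joint_training_loss(bacl_loss, nce_loss, epoch, total_epochs):
        """Blend BACL and NCE, moving from complement-class learning to the
        classification task. The squared-sine ramp below is a stand-in for
        the thesis' double-angle sine decay, not its exact formula."""
        progress = min(max(epoch / total_epochs, 0.0), 1.0)
        lam = math.sin(0.5 * math.pi * progress) ** 2   # 0 -> 1 over training
        return lam * bacl_loss + (1.0 - lam) * nce_loss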
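For point (2), the abstract describes AML (different margins for the ground-truth and complement classes) and CRW (larger loss weights for tail classes) without giving their formulas. The sketch below is an illustrative reconstruction under stated assumptions: the margin shrinks with class size and the per-sample weight is inversely proportional to class frequency; both choices, and the function name, are hypothetical.

    import torch
    import torch.nn.functional as F

    def margin_reweighted_loss(logits, targets, class_counts, margin_scale=0.5):
        """Cross-entropy with class-dependent margins plus tail-oriented
        sample re-weighting (illustrative assumptions, not the thesis' exact
        AML/CRW formulas)."""
        counts = class_counts.float()
        margins = margin_scale / counts.pow(0.25)            # larger margin for rarer classes
        weights = counts.sum() / (counts.numel() * counts)   # larger weight for tail classes

        num_classes = logits.size(1)
        one_hot = F.one_hot(targets, num_classes).float()
        # Subtract the margin from the ground-truth logit only, enlarging the
        # required gap between the ground-truth and complement classes.
        adjusted_logits = logits - one_hot * margins.unsqueeze(0)
        per_sample = F.cross_entropy(adjusted_logits, targets, reduction="none")
        return (per_sample * weights[targets]).mean()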
Keywords/Search Tags:long-tailed distribution, complement classes learning, gradient optimization, loss re-weighting, image classification