
Research on Robust Classification Methods with Label Noise

Posted on: 2024-09-24  Degree: Doctor  Type: Dissertation
Country: China  Candidate: L Wang  Full Text: PDF
GTID: 1528307184965039  Subject: Information and Communication Engineering
Abstract/Summary:
With the emergence of large-scale, high-quality annotated datasets, the field of artificial intelligence, represented by deep learning algorithms, has undergone significant advances and found broad application in domains such as computer vision, speech recognition, and natural language processing. However, acquiring large-scale, high-quality annotated data is costly in human effort and time, and gathering and annotating data from the Internet inevitably introduces a substantial number of noisy labels. Because of their large number of parameters, deep learning models are prone to overfitting noisy labels, which significantly reduces their accuracy and generalization performance. Research on robust learning algorithms that can cope with label noise is therefore important both theoretically and practically. This study follows the three main lines of learning with noisy labels, namely robust loss function design, loss function reweighting and correction, and dynamic correction of noisy labels, and explores how to reduce the negative impact of label noise on deep models. The main contributions and innovations of this study are summarized below.

(1) Existing robust loss functions do not fully exploit the information from all classes of a sample to counter label noise. This dissertation therefore proposes a noise-robust learning method based on an uncertainty-aware cross-entropy. The method applies a negative-shift affine transformation to the classical cross-entropy, lowering the loss gradient for samples with low predicted probability on the annotated class and thereby reducing sensitivity to label noise. In addition, the method introduces an uncertainty-aware term that combines the predicted probability of the annotated class with the entropy of the predicted probabilities of the other classes. This term penalizes the loss gradient for high-uncertainty samples and rewards it for low-uncertainty samples, effectively mitigating the negative impact of label noise on network training. The dissertation further proves theoretically that the proposed method is robust to label noise, and experiments on synthetic and real-world noisy-label datasets confirm its strong noise resistance and superior performance over comparable methods.

(2) Existing loss-function reweighting approaches suffer from inaccurate estimation of the noise transition matrix and poor performance on multi-class tasks. This dissertation proposes a noise-robust learning method based on probability distribution reweighting. The method estimates the label-noise transition matrix from a remapping of the probability distribution predicted by the network, which significantly reduces the estimation bias. To fully exploit the inter-class transition information contained in the noise transition matrix and to improve the generalization of loss correction methods on multi-class tasks with label noise, the dissertation extends the loss reweighting method for binary tasks to multi-class tasks with arbitrary noise. In addition, the estimated noise transition matrix is used to correct the network's output probabilities, further enhancing the noise resistance of deep models. Extensive comparisons are conducted on synthetically corrupted datasets, and the method is also applied to large-scale data-assisted few-shot fine-grained classification with real-world label noise, demonstrating strong robustness and good adaptability to different types of label noise.

(3) Existing label-correction methods cannot simultaneously update network parameters and correct noisy labels during training. This dissertation proposes a dynamic noisy-label correction method based on a self-reflection mechanism. The method constructs a dynamic iterative function that matches the network's changing label-correction ability across training stages and makes full use of the knowledge the model has acquired at each stage to correct noisy labels. It thereby avoids the need, common to existing label-correction methods, to split training into three separate stages (warm-up, label correction, and retraining). The proposed method is applicable to predefined network architectures and different types of label noise, with strong versatility and low computational overhead. Extensive comparisons on synthetic and real-world noisy-label data verify that the method has good label-correction ability and that its performance surpasses current label-correction methods of the same kind.
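To make the ideas in contribution (1) concrete, the following is a minimal numpy sketch of a shifted cross-entropy with an uncertainty-aware weight. The specific affine form, the `shift` and `beta` parameters, and the way confidence and entropy are combined are all illustrative assumptions, not the dissertation's exact formulation.

```python
import numpy as np

def uncertainty_aware_ce(probs, labels, shift=1.0, beta=0.5):
    """Illustrative sketch: shifted cross-entropy weighted by an
    uncertainty-aware term (hypothetical form, for intuition only)."""
    p = np.clip(probs, 1e-12, 1.0)
    n = p.shape[0]
    p_y = p[np.arange(n), labels]           # prob. of the annotated class

    # Negative-shift affine transform of the classical cross-entropy:
    # confident samples (low CE) contribute zero loss and zero gradient.
    ce = -np.log(p_y)
    shifted = np.maximum(ce - shift, 0.0)

    # Entropy of the renormalised probabilities of the OTHER classes.
    mask = np.ones_like(p, dtype=bool)
    mask[np.arange(n), labels] = False
    q = p[mask].reshape(n, -1)
    q = q / q.sum(axis=1, keepdims=True)
    ent = -(q * np.log(q)).sum(axis=1)

    # High uncertainty (low p_y, high other-class entropy) shrinks the
    # weight, damping the gradient from likely-mislabelled samples.
    weight = p_y * np.exp(-beta * ent)
    return (weight * shifted).mean()
```

A sample the model already fits well (high probability on its annotated class) incurs zero loss under the shift, while an ambiguous sample keeps a small, down-weighted loss.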
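Contribution (2) builds on the standard noise-transition-matrix machinery. As background, the sketch below shows a common anchor-point baseline for estimating the matrix T (where T[i, j] = P(noisy label j | clean label i)) and the usual forward loss correction through T; the dissertation's probability-distribution remapping estimator is different and not reproduced here.

```python
import numpy as np

def estimate_transition_matrix(probs, num_classes):
    """Anchor-point baseline: for each class i, take the sample the model
    is most confident belongs to i and read off its predicted distribution
    as row i of T. (Standard baseline, not the dissertation's estimator.)"""
    T = np.zeros((num_classes, num_classes))
    for i in range(num_classes):
        anchor = np.argmax(probs[:, i])
        T[i] = probs[anchor]
    return T / T.sum(axis=1, keepdims=True)   # rows are distributions

def forward_corrected_nll(probs, noisy_labels, T):
    """Forward correction: map clean-class predictions through T so the
    loss is computed against the noisy labels in a consistent way."""
    noisy_probs = probs @ T                    # P(noisy label | x)
    n = probs.shape[0]
    p = np.clip(noisy_probs[np.arange(n), noisy_labels], 1e-12, 1.0)
    return -np.log(p).mean()
```

With an identity T (no noise) the corrected loss reduces to the ordinary negative log-likelihood, which is a useful sanity check.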
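The "dynamic iterative function" of contribution (3) can be pictured as a schedule that gradually shifts trust from the (possibly noisy) labels to the network's own predictions as training matures. The sigmoid schedule and the convex blending below are hypothetical choices for illustration, not the dissertation's actual function.

```python
import numpy as np

def correction_weight(epoch, total_epochs, k=5.0):
    """Hypothetical dynamic iterative function: trust in the model's own
    predictions ramps up smoothly over training (sigmoid schedule)."""
    t = epoch / total_epochs
    return 1.0 / (1.0 + np.exp(-k * (2.0 * t - 1.0)))

def correct_labels(onehot_labels, pred_probs, epoch, total_epochs):
    """Convexly blend the given labels with the network's current
    predictions: early on the labels dominate, later the model does,
    so correction and parameter updates proceed in a single pass."""
    a = correction_weight(epoch, total_epochs)
    return (1.0 - a) * onehot_labels + a * pred_probs
```

Because the weight is near zero early and near one late, no explicit warm-up / correction / retraining split is needed; the same update rule runs throughout training.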
Keywords/Search Tags:Noisy Labels, Deep Learning, Robust Loss Function, Noise Transition Matrix, Label Correction