
Universal Domain Adaptation Under Noisy Environments

Posted on: 2023-03-22    Degree: Master    Type: Thesis
Country: China    Candidate: Y Huang    Full Text: PDF
GTID: 2558307097479014    Subject: Computer Science and Technology
Abstract/Summary:
To alleviate the dependence of deep learning on large-scale, well-labeled training data and to relax the assumption that the training and test sets are drawn from the same distribution, domain adaptation leverages rich labeled data from a source domain to learn an accurate classifier for an unlabeled target domain. Traditional domain adaptation methods assume that source-domain datasets are noise-free and accurately annotated. In noisy scenarios, directly applying domain adaptation algorithms leads to serious negative transfer. Domain adaptation under noisy environments transfers useful knowledge from a source domain corrupted by label noise and/or feature noise, and reduces the cost of collecting high-quality datasets that are sufficiently relevant to the target domain. However, existing methods assume that an accurate noise rate is known in advance in order to reduce the negative transfer caused by source-domain noise, which limits their application in the real world, where the noise rate is unknown. Therefore, we propose single-source and multi-source universal domain adaptation algorithms for unknown noisy environments, respectively.

(1) In domain adaptation under noisy environments, existing works assume access to supervised information such as labeled target-domain data or a known noise rate. In this thesis, we relax these assumptions by proposing a universal domain adaptation algorithm based on progressive distillation, called PDCAS. To alleviate the negative effect of noisy source samples without any extra supervision, PDCAS distills correct samples from the noisy source domain during iterative training, based on the characteristics of the data distribution and the discriminative ability of the classifier. To maximally align the conditional distributions between domains, PDCAS simultaneously matches the marginal feature distributions and the label distributions by integrating a class-balanced sampling strategy into the margin disparity
discrepancy. Extensive experiments on the Office-31 and Office-Home datasets demonstrate the robustness and universality of PDCAS under noisy environments compared to state-of-the-art methods.

(2) For multi-source domain adaptation under noisy environments, we extend PDCAS to multi-source noisy scenarios and propose a novel robust multi-source domain adaptation method called MSPDCAS. In the data-processing module, MSPDCAS corrects the labels of noisy source samples and selects clean samples based on the updated model; notably, the model is trained only on clean data. In the multi-domain alignment module, MSPDCAS maps the multiple source domains and the target domain into a common feature space with a shared feature extractor, and then aligns the distributions of each source-target pair with multiple classifiers. In the classification ensemble module, MSPDCAS obtains the final category prediction by combining the outputs of these classifiers. To demonstrate that our method is independent of the noise distribution, we construct three types of noisy datasets with long-tailed noise, step noise and random noise, respectively. The experimental results validate the robustness and scalability of MSPDCAS.
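The abstract does not detail the distillation criterion PDCAS uses to separate clean from noisy source samples. One common instantiation of such progressive selection (shown here purely as an illustrative assumption, not as the actual PDCAS procedure) is a small-loss rule that keeps the fraction of lowest-loss samples and tightens that fraction as training proceeds:

```python
import numpy as np

def progressive_select(losses, epoch, max_epoch, min_keep=0.5):
    """Keep the lowest-loss source samples; the kept fraction shrinks
    linearly from 1.0 toward `min_keep` over training, so high-loss
    (likely noisy) samples are gradually filtered out."""
    keep_frac = 1.0 - (1.0 - min_keep) * min(epoch / max_epoch, 1.0)
    n_keep = max(1, int(round(keep_frac * len(losses))))
    # indices of the n_keep smallest losses -> treated as "clean"
    return np.argsort(losses)[:n_keep]

losses = np.array([0.1, 2.5, 0.3, 3.0, 0.2])
idx = progressive_select(losses, epoch=10, max_epoch=10, min_keep=0.6)
# keeps the 3 lowest-loss samples: indices 0, 4, 2
```

The `min_keep` floor is a hypothetical safeguard here: it prevents the selection from discarding so many samples that too little training signal remains.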
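The classification ensemble module of MSPDCAS combines the outputs of the per-source classifiers. A minimal sketch of that idea, assuming linear classifiers over shared features and simple softmax averaging (the thesis may use a different combination rule), looks like:

```python
import numpy as np

def softmax(z):
    # numerically stable row-wise softmax
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def ensemble_predict(features, classifier_weights):
    """Average the softmax outputs of one linear classifier per source
    domain (each W has shape (feature_dim, num_classes)) over shared
    features, then take the argmax as the final category prediction."""
    probs = [softmax(features @ W) for W in classifier_weights]
    return np.mean(probs, axis=0).argmax(axis=1)

# toy example: 2 target samples, 3-dim shared features, 2 classes
feat = np.array([[5.0, 0.0, 0.0], [0.0, 5.0, 0.0]])
W = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # hypothetical weights
preds = ensemble_predict(feat, [W, W])
# → [0, 1]
```

Averaging probabilities rather than hard votes lets a confident classifier outweigh an uncertain one, which is a common motivation for soft ensembles.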
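The three noise types used in the experiments (long-tailed, step, random) can all be expressed as label corruption with a per-class flip rate. The sketch below is an illustrative construction under that assumption, not the thesis's exact protocol: a constant rate vector gives random (uniform) noise, a two-level vector gives step noise, and a smoothly decaying vector gives long-tailed noise.

```python
import numpy as np

def corrupt_labels(labels, rates, num_classes, rng=None):
    """Flip each label to a different, uniformly chosen class with a
    per-class probability `rates[y]`."""
    if rng is None:
        rng = np.random.default_rng(0)
    noisy = labels.copy()
    for i, y in enumerate(labels):
        if rng.random() < rates[y]:
            # draw the corrupted label from the other classes
            choices = [c for c in range(num_classes) if c != y]
            noisy[i] = rng.choice(choices)
    return noisy

labels = np.zeros(1000, dtype=int)          # all samples from class 0
noisy = corrupt_labels(labels, rates=[0.4, 0.0], num_classes=2)
# roughly 40% of the class-0 labels are flipped to class 1
```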
Keywords/Search Tags:Domain adaptation, Multi-source domain adaptation, Noisy environments, Universality, Deep learning