
Research On Unbiased Distillation Method For Recommendation

Posted on: 2024-04-20    Degree: Master    Type: Thesis
Country: China    Candidate: G Chen    Full Text: PDF
GTID: 2568306932454754    Subject: Data Science (Computer Science and Technology)
Abstract/Summary:
Recommendation systems have become an important tool for providing personalized recommendations to users, especially in the era of big data. However, as the numbers of users and items grow, it becomes increasingly difficult to balance accuracy and efficiency when recommending items of interest. Knowledge distillation addresses this problem by compressing a large recommendation model into a smaller one while maintaining its accuracy. Although knowledge distillation has made significant progress in real-time recommendation systems, this thesis finds that existing methods still amplify popularity bias, which severely degrades the user experience. To address this issue, this thesis follows a step-by-step exploration strategy to develop an unbiased knowledge distillation method for recommendation systems.

First, this thesis studies unbiased distillation with an unbiased teacher model. The main cause of bias amplification during distillation is found to be the biased teacher model. To investigate the impact of an unbiased teacher on distillation, the study applies two debiasing strategies to remove the teacher model's bias and conducts extensive experiments on two distillation strategies. The results indicate that this approach alleviates bias amplification to some extent, but the debiasing effect remains limited and the method is difficult to tune.

Second, this thesis studies unbiased distillation via inverse propensity scores. The goal is an unbiased distillation strategy that is independent of the model structure and does not intervene in the training of the teacher model. The thesis analyzes the causes of bias amplification in the three steps of existing distillation methods, namely sampling, weighting, and loss-function design, and eliminates the corresponding bias by designing inverse propensity scores for each step. In experiments, the proposed method achieves performance comparable to that of the unbiased-teacher method at a lower training cost.

Finally, this thesis presents a new distillation method that achieves unbiased distillation through causal intervention. Items are first divided into multiple groups by popularity, and the ranking knowledge within each group is then extracted to supervise the student's learning. By constructing a causal graph of the distillation process, the thesis uses causal inference tools to verify the feasibility of the method and demonstrates its effectiveness through experiments.

In summary, this thesis investigates unbiased knowledge distillation strategies for recommendation systems. Existing knowledge distillation methods can amplify the bias towards popular items, degrading recommendation accuracy for unpopular items. The proposed methods aim to address these issues and achieve unbiased knowledge distillation, thereby improving the user experience.
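To make the compression idea concrete, the following is a minimal sketch of soft-label knowledge distillation for a recommender, in PyTorch. All names (teacher_scores, student_scores) and the temperature value are illustrative assumptions, not the thesis's actual models or hyperparameters.

```python
import torch
import torch.nn.functional as F

# Hypothetical setup: scores over the item catalogue for one user.
# teacher_scores comes from a large trained model, student_scores from
# the small model being trained; both names are illustrative.
num_items = 1000
teacher_scores = torch.randn(num_items)
student_scores = torch.randn(num_items, requires_grad=True)

def distillation_loss(student_scores, teacher_scores, temperature=2.0):
    # Soft-label distillation: the student matches the teacher's
    # softened item distribution via KL divergence.
    teacher_probs = F.softmax(teacher_scores / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_scores / temperature, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="sum") * temperature ** 2

loss = distillation_loss(student_scores, teacher_scores)
loss.backward()  # gradients flow only into the student
```

Because the teacher's soft distribution concentrates on popular items, naively minimizing this loss is exactly where the popularity-bias amplification described above can arise.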
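The inverse-propensity idea from the second study can be illustrated as follows. This is a hedged sketch assuming a simple popularity-based propensity model p_i ∝ popularity_i^γ and a squared-error distillation target; the thesis's actual propensity designs for the sampling, weighting, and loss-function steps may differ.

```python
import torch

# Illustrative data: observed interaction counts per item. The propensity
# model below (propensity ~ popularity ** gamma) is an assumption.
popularity = torch.tensor([500., 120., 30., 5., 1.])  # clicks per item
gamma = 0.5
propensity = (popularity / popularity.sum()) ** gamma
ips_weight = 1.0 / torch.clamp(propensity, min=1e-6)  # inverse propensity
ips_weight = ips_weight / ips_weight.mean()           # normalize for stability

teacher_scores = torch.randn(5)
student_scores = torch.randn(5, requires_grad=True)

# Per-item error against the teacher, reweighted so that unpopular items
# (low propensity) contribute more, counteracting popularity bias.
per_item = (student_scores - teacher_scores) ** 2
loss = (ips_weight * per_item).mean()
loss.backward()
```

The appeal of this weighting, consistent with the study's goal, is that it touches neither the teacher's training nor the model architecture: only the distillation objective is reweighted.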
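The popularity-grouped ranking distillation from the final study can likewise be sketched. Assumptions here: equal-size groups by popularity rank and a BPR-style pairwise loss within each group; the thesis's group construction and exact ranking loss may differ.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
num_items, num_groups = 12, 3

popularity = torch.randint(1, 1000, (num_items,)).float()
teacher_scores = torch.randn(num_items)
student_scores = torch.randn(num_items, requires_grad=True)

# Partition items into equal-size popularity groups (most to least popular),
# so ranking knowledge is only compared among items of similar popularity.
order = torch.argsort(popularity, descending=True)
groups = torch.chunk(order, num_groups)

loss = 0.0
for g in groups:
    # Within each group, the teacher's ranking is the supervision signal:
    # for every pair the teacher ranks i above j, push the student to
    # score i above j as well (a BPR-style pairwise objective).
    g = g[torch.argsort(teacher_scores[g], descending=True)]
    for hi in range(len(g)):
        for lo in range(hi + 1, len(g)):
            diff = student_scores[g[hi]] - student_scores[g[lo]]
            loss = loss - F.logsigmoid(diff)
loss.backward()
```

Comparing items only within a popularity group blocks the path by which raw teacher scores transfer popularity bias, which matches the causal-intervention intuition described in the abstract.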
Keywords/Search Tags: Recommendation, Knowledge Distillation, Popularity Bias