
Research On Key Technology Of Ensemble Knowledge Distillation In Federated Learning

Posted on: 2023-09-06
Degree: Master
Type: Thesis
Country: China
Candidate: J X Dao
Full Text: PDF
GTID: 2568306617483544
Subject: Computer technology
Abstract/Summary:
Federated learning enables distributed clients to jointly train deep learning models without sharing local raw data, effectively alleviating the data-island problem. However, existing federated learning schemes suffer from high communication costs and large numbers of deployed model parameters, and they require the distributed clients and the central server to adopt the same network structure, which cannot guarantee the heterogeneity and privacy of the local models. These issues have become the main bottleneck restricting the development of federated learning. To address them, this thesis builds on ensemble knowledge distillation and studies how to reduce both the communication cost of federated learning and the scale of the federated model parameters. The main contributions are as follows:

1. To address the high communication cost and the difficulty of model deployment, this thesis proposes an ensemble knowledge distillation method based on maximum mutual information. A multi-teacher ensemble distillation framework is used to strengthen the student model's generalization ability on the target task. During distillation training, a maximum mutual information objective is introduced to compute the loss between the teacher models and the student model, strengthening the student model's ability to learn from the teachers. Experiments on a relation extraction task demonstrate the performance of the proposed method (a hedged code sketch of such a loss appears after this list).

2. To address the heterogeneity and privacy problems of distributed client models, this thesis proposes a federated learning method based on ensemble knowledge distillation. Ensemble knowledge distillation is introduced into the federated learning framework, allowing distributed clients to choose different local model structures when participating in federated training, with federated distillation performed on the central server in a client-server mode (see the second sketch below). The results show that this framework effectively improves the performance of the student model on the central server, reduces the number of deployed student-model parameters, and lowers the data-transmission cost of each round of federated training.

3. On the basis of the federated learning method based on ensemble knowledge distillation, this thesis completes its engineering realization. A distributed federated distillation platform was designed and implemented. The platform provides an interface through which distributed clients join federated training, and users can obtain the aggregated model through simple configuration operations on the platform, which effectively improves the convenience of federated training.
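The abstract does not spell out the exact form of the maximum-mutual-information distillation objective, so the following is only a minimal PyTorch sketch of the two ingredients it names: distilling a student from the averaged soft targets of several teachers, plus a mutual-information term between teacher and student representations. The function names (ensemble_distill_loss, infonce_mi_bound) and the use of the InfoNCE lower bound as an MI proxy are assumptions, not the thesis's exact formulation.

```python
import torch
import torch.nn.functional as F

def ensemble_distill_loss(student_logits, teacher_logits_list, labels,
                          temperature=2.0, alpha=0.5):
    """Distill a student from the averaged soft targets of several teachers (sketch)."""
    # Soften each teacher's logits and average them (uniform ensemble weights assumed).
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)
    # KL divergence between the student's softened distribution and the ensemble.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        teacher_probs, reduction="batchmean",
    ) * temperature ** 2
    # Ordinary supervised loss on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

def infonce_mi_bound(student_feat, teacher_feat, temperature=0.1):
    """InfoNCE loss; minimizing it maximizes a lower bound on the mutual
    information between paired student and teacher features (assumed MI proxy)."""
    s = F.normalize(student_feat, dim=-1)
    t = F.normalize(teacher_feat, dim=-1)
    logits = s @ t.t() / temperature                      # pairwise similarity scores
    targets = torch.arange(s.size(0), device=s.device)    # diagonal = positive pairs
    return F.cross_entropy(logits, targets)
```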
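Likewise, here is a hedged sketch of one federated distillation round as described in point 2: each client, possibly with its own architecture, scores a shared public batch and sends only its logits, and the central server distills the averaged predictions into its student model. The function names, the use of a public transfer batch, and the uniform aggregation are assumptions rather than the thesis's exact protocol.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def client_logits_on_public_batch(client_models, public_batch):
    """Heterogeneous client models score the same public batch; only logits leave each client."""
    return [model(public_batch) for model in client_models]

def server_distill_step(server_student, optimizer, public_batch, all_client_logits,
                        temperature=2.0):
    """One server-side distillation step fitting the student to the clients' ensemble."""
    # Average the clients' softened predictions (uniform weighting assumed).
    ensemble_probs = torch.stack(
        [F.softmax(l / temperature, dim=-1) for l in all_client_logits]
    ).mean(dim=0)
    student_logits = server_student(public_batch)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        ensemble_probs, reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because only per-example logits cross the network in this scheme, per-round traffic scales with the public batch size and the number of classes rather than with model size, which is consistent with the communication saving the abstract claims.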
Keywords/Search Tags: Federated learning, Knowledge distillation, Mutual information, Ensemble representation