
A Study Of Privacy Preserving Federated Learning Based On Differential Privacy And Secure Shuffling

Posted on: 2022-11-16
Degree: Master
Type: Thesis
Country: China
Candidate: H X He
Full Text: PDF
GTID: 2518306773990569
Subject: Automation Technology

Abstract/Summary:
Federated learning helps solve the problem of data silos in multi-party computation: participants collaboratively train a high-quality global model in a distributed fashion without sharing their local data, and it has become a popular research direction in both industry and academia. However, a large body of work has shown that the federated learning mechanism has many security vulnerabilities, which can be exploited by internal participants as well as external attackers to undermine the security of federated learning systems. Differential privacy is a state-of-the-art technique for protecting privacy in federated learning, but applying it can reduce model accuracy and hinder the convergence of the global model. How to reduce the loss of model accuracy and the communication overhead while preserving local data privacy is therefore an urgent research problem. In this thesis, we design, implement, and evaluate a practical federated learning system that preserves data privacy while maintaining model utility and communication efficiency as far as possible. The main work and contributions of this thesis are as follows:

1. In the setting of federated learning with local differential privacy, we study gradient-based adaptive noise addition and gradient-based adaptive clipping, with the aim of reducing the utility loss that differential privacy imposes on the model. Adaptive clipping and noise addition are applied during the gradient descent process, the privacy loss is tracked with the Moments Accountant mechanism, and model security is demonstrated against privacy attacks. The adaptive noise method based on the contribution rate of neurons uses the contribution rate to assign noise: less noise is added to features with a higher contribution rate, which preserves model accuracy (a minimal illustrative sketch of this clipping-and-noise step is given after the abstract).

2. We design a novel secure shuffle algorithm for federated learning that combines differential privacy with the sparse vector technique, and we propose a Top-K gradient selection scheme that decouples the overhead of local differential privacy from the dimensionality of the gradient vector, saving privacy budget. In addition, the ESA (Encode-Shuffle-Analyze) framework is introduced into the federated learning model to realize shuffle differential privacy, combined with exponentially decaying dynamic client sampling. Client sampling and gradient shuffling together yield a double privacy amplification effect, reducing the overall privacy loss of the system (a sketch of the Top-K selection, sampling, and shuffling steps is also given after the abstract).
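Illustration of contribution 1: the following is a minimal Python/NumPy sketch of gradient clipping followed by contribution-rate-weighted Gaussian noise. The abstract does not specify how the contribution rate is computed or how noise is apportioned, so the linear weighting (1 - 0.5 * c), the quantile-based adaptive clipping threshold, and all function names are illustrative assumptions; the noise here is not calibrated to a formal (epsilon, delta) guarantee, and the Moments Accountant is not implemented.

import numpy as np

def adaptive_clip_norm(recent_norms, quantile=0.5):
    # Adapt the clipping threshold to a quantile of recently observed
    # gradient norms (one simple way to make clipping "adaptive").
    return float(np.quantile(recent_norms, quantile))

def clip_and_noise(grad, clip_norm, noise_multiplier, contribution, rng):
    # Clip the gradient to clip_norm in L2 norm, then add Gaussian noise
    # whose per-coordinate scale shrinks for coordinates with a higher
    # contribution rate (less noise on high-contribution features).
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    c = contribution / (contribution.max() + 1e-12)   # normalize to [0, 1]
    sigma = noise_multiplier * clip_norm * (1.0 - 0.5 * c)
    return clipped + rng.normal(0.0, 1.0, size=grad.shape) * sigma

# Usage sketch for a single client update.
rng = np.random.default_rng(0)
grad = rng.normal(size=128)                  # placeholder local gradient
contribution = np.abs(rng.normal(size=128))  # placeholder contribution rates
C = adaptive_clip_norm(3.0 * np.abs(rng.normal(size=50)))
noisy_update = clip_and_noise(grad, C, noise_multiplier=1.1,
                              contribution=contribution, rng=rng)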
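Illustration of contribution 2: the sketch below shows Top-K gradient sparsification, exponentially decaying client sampling, and the shuffling step of an ESA-style pipeline. The decay form of the sampling rate, the choice of k, and all function names are assumptions for illustration; the sparse vector technique, the local randomizers, and the privacy accounting of the shuffle model are not implemented here.

import numpy as np

def top_k_sparsify(grad, k):
    # Keep only the k largest-magnitude coordinates; the client then reports
    # k values instead of the full gradient, which is the idea of decoupling
    # the local-DP overhead from the gradient dimensionality.
    idx = np.argsort(np.abs(grad))[-k:]
    return idx, grad[idx]

def sample_clients(num_clients, base_rate, round_idx, decay, rng):
    # Exponentially decaying sampling rate: fewer clients are sampled in
    # later rounds (assumed form of the "exponential decay" idea).
    rate = base_rate * np.exp(-decay * round_idx)
    return np.flatnonzero(rng.random(num_clients) < rate)

def shuffle_reports(reports, rng):
    # The shuffler in the ESA pipeline permutes the batch of reports so the
    # analyzer cannot link a report to the client that produced it.
    order = rng.permutation(len(reports))
    return [reports[i] for i in order]

# Usage sketch for one round (local randomization/noise is assumed to have
# been applied to each report already).
rng = np.random.default_rng(1)
local_grads = [rng.normal(size=1000) for _ in range(100)]
selected = sample_clients(len(local_grads), base_rate=0.3, round_idx=5,
                          decay=0.05, rng=rng)
reports = [top_k_sparsify(local_grads[i], k=50) for i in selected]
shuffled = shuffle_reports(reports, rng)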
Keywords/Search Tags: Federated learning, Privacy preserving, Differential privacy, Secure shuffle