
Research On Federated Learning Security Algorithm Under Byzantine Attack

Posted on: 2023-04-16    Degree: Master    Type: Thesis
Country: China    Candidate: L Fang    Full Text: PDF
GTID: 2558307103466974    Subject: Electronic and communication engineering
Abstract/Summary:
With the advent of the era of artificial intelligence, machine learning driven by big data is widely used in many fields. However, the development of machine learning technology still faces two major challenges: data islands and data privacy security. To address these issues, Federated Learning has emerged as a new distributed machine learning paradigm. Existing Federated Learning research often assumes that all participants are secure and trustworthy, but this assumption is difficult to satisfy in practical application scenarios. When a participating client is maliciously compromised, or its device itself fails, it sends incorrect parameter information to the central server; when the central server aggregates these parameters, model training deviates, fails, or even leaks privacy. In the related literature, a client that sends wrong information to the central server is called a Byzantine client, and this abnormal client behavior is called a Byzantine attack. This dissertation mainly studies the optimization of the central server's robust gradient aggregation algorithm when Byzantine attacks are present in the Federated Learning scenario.

To address the Byzantine attack problem under the Federated Learning framework, this dissertation proposes density-based robust optimization algorithms. First, for the classical Byzantine attack, a gradient local density-based Federated Learning security algorithm (GDB-FLS) is proposed. The key idea is to compute the local density of each client's gradient as the basis for judging whether the client is Byzantine, and then to design weights from the local densities of the secure gradients for weighted aggregation. Second, this dissertation extends the classical Byzantine attack to the generalized Byzantine attack and optimizes GDB-FLS accordingly: by computing the local density of each dimension element of the gradient and detecting Byzantine information element-wise, a novel Federated Learning security algorithm based on the local density of gradient elements (GEDB-FLS) is proposed. It enables the central server to resist more covert generalized Byzantine attacks and to achieve robust gradient aggregation. Finally, considering the influence of the kernel-width hyperparameter on the performance of the proposed algorithms and on resource utilization, an adaptive kernel width selection strategy is proposed. This strategy sorts the distances between gradients twice and, in each iteration, adaptively selects a reliable gradient distance as the kernel-width reference value, which eliminates manual kernel-width tuning.

Comprehensive simulation experiments are carried out to verify the performance of the proposed algorithms on both synthetic and real datasets. By simulating the Federated Learning environment under various Byzantine attacks, it is verified that the proposed GDB-FLS and GEDB-FLS algorithms outperform mainstream Byzantine-robust aggregation algorithms and improve the accuracy of the trained model. The experiments also show that whether client data is independent and identically distributed (IID) or non-IID, GDB-FLS and GEDB-FLS can effectively resist Byzantine attacks and ensure model convergence. Finally, experiments demonstrate that the proposed adaptive kernel width selection strategy solves the kernel-width tuning problem, effectively reduces the tuning time of the algorithms, and improves resource utilization.
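The density-based detection and the adaptive kernel-width selection described above might be sketched roughly as follows. This is an illustrative assumption only: the abstract does not give the exact formulas, so the Gaussian kernel, the density-threshold rule for flagging Byzantine clients, the density-proportional weights, and the "median of medians" reading of the double-sort strategy are all hypothetical choices, not the dissertation's actual method.

```python
# Hypothetical sketch of density-based robust aggregation in the spirit of
# GDB-FLS. Kernel choice, threshold, and weighting are assumptions; the
# thesis's exact definitions are not reproduced here.
import numpy as np

def pairwise_distances(grads):
    # grads: (n_clients, dim) array of flattened client gradients
    diff = grads[:, None, :] - grads[None, :, :]
    return np.linalg.norm(diff, axis=2)          # (n_clients, n_clients)

def adaptive_kernel_width(dist):
    # One possible reading of "sort the distances twice": for each client,
    # sort its distances and take the median (dropping the zero
    # self-distance), then sort those per-client values and take the
    # median again as the kernel-width reference value.
    per_client = np.median(np.sort(dist, axis=1)[:, 1:], axis=1)
    return np.median(np.sort(per_client))

def gdb_fls_aggregate(grads, density_threshold=0.5):
    dist = pairwise_distances(grads)
    h = adaptive_kernel_width(dist)
    # Gaussian-kernel local density of each client's gradient: gradients
    # in a dense (honest) cluster get high density, outliers get low density.
    density = np.exp(-(dist / h) ** 2).sum(axis=1)
    density = density / density.max()
    # Clients whose normalized density falls below the threshold are
    # treated as Byzantine and excluded from aggregation.
    keep = density >= density_threshold
    weights = density[keep] / density[keep].sum()   # density-based weights
    return weights @ grads[keep]

# Toy example: 8 honest clients near 0, 2 Byzantine clients far away.
rng = np.random.default_rng(0)
honest = rng.normal(0.0, 0.1, size=(8, 5))
byzantine = rng.normal(5.0, 0.1, size=(2, 5))
agg = gdb_fls_aggregate(np.vstack([honest, byzantine]))
```

In this sketch the aggregate stays close to the honest clients' mean because the Byzantine gradients, being far from the dense honest cluster, receive near-zero local density and are filtered out before the weighted average.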
Keywords/Search Tags:Federated Learning, Byzantine attack, security algorithm, robust gradient aggregation, adaptive kernel width