The development of big data and AI technologies faces the dilemma of data silos. Federated learning was proposed in this context, yet it still carries the risk of privacy leakage, and introducing a differential privacy mechanism improves security at the cost of model performance. It is therefore worth investigating how to maintain strong model performance when differential privacy techniques are applied to federated learning. Current research on this problem faces two challenges: the waste of privacy budget during differential privacy budget allocation, and inconsistent contributions to the global model caused by differences among training participants during federated parameter aggregation.

To address these two problems, this paper proposes FAUST, a federated aggregated data protection algorithm based on dynamic differential privacy budget allocation and user trust values, and describes its design in detail. FAUST comprises the dynamic differential privacy budget allocation algorithm DP-DBA, the federated weight aggregation algorithm FedWgt, and the user trust value update algorithm TRUE. From the perspective of privacy protection, DP-DBA accounts for each user's remaining budget and the difference between the local and global models during iteration, achieving dynamic allocation of the privacy budget in the differential privacy protection process. From the perspective of model accuracy, FedWgt optimizes the federated aggregation algorithm based on users' differing contributions to the model. TRUE updates user trust values according to changes in user contributions during model training. Experimental results show that the FAUST algorithm avoids wasting the privacy budget
compared with the original NbAFL algorithm and Lap algorithm, and achieves higher model accuracy under the same privacy budget ε.

This paper also designs and implements FDPS, a data protection system based on the FAUST algorithm, following software engineering methods, and describes in detail the requirements analysis, outline design, detailed design, and system testing. The system test results show that all functional modules of FDPS run normally. On the client side, users can set the privacy protection level and model parameters according to their needs and view model training results at any time, which increases interactivity and improves user trust. On the server side, administrators can monitor client status and view model training results in real time, helping them diagnose model anomalies and improve model performance.
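The thesis does not give the DP-DBA formulas here, but the idea it describes, spending more of a user's remaining privacy budget in rounds where the local model diverges further from the global model, can be illustrated with a minimal sketch. All names and the specific allocation formula below are assumptions for illustration, not the thesis's actual design:

```python
def allocate_budget(remaining_budget: float,
                    rounds_left: int,
                    divergence: float,
                    max_divergence: float) -> float:
    """Hypothetical per-round epsilon allocation in the spirit of DP-DBA.

    Starts from an even split of the remaining budget over the remaining
    rounds, scales it up when the local/global model divergence is large,
    and never exceeds what the user has left.
    """
    if rounds_left <= 0 or remaining_budget <= 0:
        return 0.0
    base = remaining_budget / rounds_left                   # even-split baseline
    scale = 1.0 + divergence / max(max_divergence, 1e-12)   # spend more when divergent
    return min(base * scale, remaining_budget)


def laplace_noise_scale(sensitivity: float, epsilon: float) -> float:
    """Laplace mechanism: noise scale b = sensitivity / epsilon,
    so a larger per-round epsilon means less noise on that round's update."""
    return sensitivity / epsilon
```

Under this toy formula, a user with budget 1.0 and 10 rounds left spends 0.1 when perfectly aligned with the global model and up to 0.2 when maximally divergent, which is one way to avoid leaving budget unused by a fixed uniform split.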
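Similarly, the contribution-weighted aggregation of FedWgt and the trust update of TRUE can be sketched in miniature. The weighting scheme and the exponential-smoothing trust update below are illustrative assumptions; the thesis's actual algorithms may differ:

```python
def fed_weighted_avg(updates: list[list[float]],
                     trust: list[float]) -> list[float]:
    """Aggregate client parameter vectors with weights proportional to each
    client's trust value (a contribution-aware variant of plain averaging)."""
    total = sum(trust)
    dim = len(updates[0])
    return [sum(t * u[i] for t, u in zip(trust, updates)) / total
            for i in range(dim)]


def update_trust(trust: list[float],
                 contribution: list[float],
                 lr: float = 0.5) -> list[float]:
    """Nudge each trust value toward the client's latest contribution score,
    so clients whose updates help the global model gain aggregation weight."""
    return [(1 - lr) * t + lr * c for t, c in zip(trust, contribution)]
```

With equal trust values this reduces to ordinary federated averaging; unequal trust shifts the global model toward the clients judged to contribute more.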