
Research On Privacy Disclosure And Risk Assessment Technology For Model With Differential Privacy

Posted on: 2022-03-13  Degree: Master  Type: Thesis
Country: China  Candidate: G X Huang  Full Text: PDF
GTID: 2518306572451024  Subject: Cyberspace security
Abstract/Summary:
Differential privacy is an important means of protecting privacy, and many deep learning systems and platforms now rely on it as a key safeguard for users' data. At the same time, many third-party service providers hold additional data sources in the same data domain and integrate them to train deep learning models, so as to obtain better prediction results. To protect data privacy, they may apply differential privacy to these models. How to conduct a risk audit and assessment of third-party models protected by differential privacy, without access to the training data those models used, has become a problem worth exploring.

This paper first analyzes in depth the application scenarios that combine differential privacy with deep learning models, and discusses the privacy-leakage risks and practical deployment settings of models under differential privacy protection. To address the problem of data sampling in the risk audit and assessment of differentially private models, a data augmentation algorithm based on local statistical information of images is implemented; the generated data lies in the same data domain as the real data. By analyzing the hardware resources required by differentially private deep learning models, and with a view to the reliability of container-based automated risk-assessment scheduling, unified management and scheduling of resources is realized on top of the existing container environment, and a set of container usage rule-file templates is designed so that risk assessment can be run automatically by configuring a rule file. Furthermore, the risk audit must determine whether a third-party service provider has used user data without authorization, without obtaining that provider's data set. For this scenario, the paper proposes a Kolmogorov-Smirnov distance audit method, which decides whether private data has been used without authorization by comparing the cumulative distribution functions of the outputs of different differentially private models.

Finally, experiments evaluate the data-sampling effect of the risk assessment method, the automated orchestration of containers, and the risk audit and assessment method. The results show that, for deep learning models under differential privacy protection, the sampling effect of the proposed data augmentation method meets the needs of the risk assessment and audit method; the two deep learning models constructed satisfy the requirements of audit evaluation and improve risk-assessment performance; and the proposed Kolmogorov-Smirnov distance audit method is able to perform risk assessment on differentially private models.
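The Kolmogorov-Smirnov distance audit described above can be sketched as follows. The score distributions, sample sizes, and variable names here are illustrative assumptions, not the thesis's actual experimental setup; only the core idea — comparing the empirical cumulative distribution functions of two models' outputs via the KS statistic — comes from the abstract.

```python
# Minimal sketch of a KS-distance audit, assuming the auditor can collect
# per-sample confidence scores from two differentially private models: one
# suspected of training on the private data, and a reference model that did not.
import bisect
import random

def ks_distance(sample_a, sample_b):
    """Two-sample KS distance: the largest gap between the empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)
    points = sorted(set(a) | set(b))

    def ecdf(sorted_sample, x):
        # Fraction of the sample that is <= x.
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    return max(abs(ecdf(a, p) - ecdf(b, p)) for p in points)

random.seed(0)
# Synthetic stand-ins for model confidence scores on probe inputs.
scores_suspect = [random.betavariate(8, 2) for _ in range(500)]    # skews high
scores_reference = [random.betavariate(5, 5) for _ in range(500)]  # more diffuse

d = ks_distance(scores_suspect, scores_reference)
print(f"KS distance: {d:.3f}")
```

A large distance indicates that the two output distributions differ, suggesting the private data may have influenced training; in practice an audit would compare the distance against a threshold calibrated on models known not to have used the data.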
Keywords/Search Tags:Differential Privacy, Deep Learning, Audit Method, Kolmogorov-Smirnov Distance