
Research On Privacy Protection Of Deep Learning Application Services In Cloud Environment

Posted on: 2022-12-25
Degree: Master
Type: Thesis
Country: China
Candidate: G L Zhang
Full Text: PDF
GTID: 2518306764467604
Subject: Automation Technology
Abstract/Summary:
With the maturity of deep learning and its growing user base, offering deep learning as a cloud service has become inevitable, but its privacy problems must be solved before such services can be put into use. In the cloud environment, users of deep learning applications want their personal data to remain private during use, while owners of deep learning models want the model parameters to be protected from theft by users or cloud computing providers, so as to secure their own interests. Existing schemes mainly address the leakage of users' personal data but offer no good solution for the parameter privacy of deep learning models; moreover, few of them can resist collusion attacks, and they lack any mechanism to guarantee the secure revocation of a deep learning model. To remedy these shortcomings, this thesis proposes the following three schemes.

(1) Current privacy protection schemes built on secure multi-party computation either cannot protect the parameter privacy of deep learning models or cannot resist collusion attacks by the participants. To address these problems, this thesis combines secure multi-party computation with homomorphic encryption to construct a privacy protection scheme that protects user data privacy and model parameter privacy at the same time. The scheme also resists collusion between users and a subset of the cloud servers, as well as collusion between the deep learning application provider and a subset of the cloud servers (a secret-sharing sketch is given below).

(2) There are currently no effective measures to guarantee the secure revocation of deep learning models. A cloud computing provider can maliciously copy the corresponding code during deployment, and after the model has been revoked it can still deploy the deep learning service for profit, even without obtaining the model parameters. For this problem, this thesis combines fully homomorphic encryption with digital signatures to construct a revocable privacy protection scheme for deep learning applications. Under the assumption of no collusion, the scheme protects the privacy of user data and model parameters and guarantees the secure revocation of the deep learning model (a signature-based authorization sketch is given below).

(3) To prevent user privacy leakage caused by collusion between the cloud server and the deep learning model owner, and to reduce the computational overhead on the user side so that the scheme can be applied in Internet of Things environments, this thesis combines differential privacy, fully homomorphic encryption, a hybrid network, and digital signatures to construct a collusion-resistant and revocable privacy-preserving scheme for deep learning applications. The client only needs to compute the first network layer and add differential privacy noise in plaintext, without performing any encrypted computation, so the scheme is well suited to Internet of Things devices with limited computing power (a client-side sketch is given below).

In summary, on the basis of protecting user data privacy, this thesis proposes privacy protection schemes for model parameter privacy and secure model revocation, complements the solutions in related research fields, and lays a solid foundation for the practical use of deep learning applications in the cloud environment.
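The thesis does not include its protocol code here; as a minimal, illustrative sketch of the secure multi-party computation building block used in the first scheme, the snippet below splits a user's (quantized) input into additive shares for independent cloud servers, so that no single server, and no colluding subset short of all of them, ever sees the plaintext. All function names and the modulus are hypothetical choices for illustration, not values from the thesis.

```python
import secrets

PRIME = 2**61 - 1  # public modulus for additive secret sharing (illustrative choice)

def share(value: int, n_servers: int) -> list[int]:
    """Split `value` into n additive shares mod PRIME; fewer than n shares reveal nothing."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_servers - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine all shares to recover the secret."""
    return sum(shares) % PRIME

# Example: a user splits one quantized input feature among three cloud servers.
x = 42
assert reconstruct(share(x, 3)) == x
# In the full scheme the model parameters are likewise hidden (via homomorphic
# encryption), so neither the user's input nor the parameters appear in the clear.
```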
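The revocation mechanism of the second scheme is only described at a high level in the abstract. One plausible reading is that the model owner signs short-lived authorization tokens that the cloud provider must present before serving requests, so revoking the model simply means issuing no further tokens. The sketch below illustrates that idea with Ed25519 signatures from the `cryptography` package; the token format, expiry policy, and all names are assumptions for illustration.

```python
import json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Model owner's long-term signing key; the public key is distributed to verifiers.
owner_key = Ed25519PrivateKey.generate()
owner_pub = owner_key.public_key()

def issue_token(model_id: str, ttl_seconds: int = 3600) -> tuple[bytes, bytes]:
    """Owner issues a short-lived authorization for serving `model_id`."""
    token = json.dumps({"model": model_id, "expires": time.time() + ttl_seconds}).encode()
    return token, owner_key.sign(token)

def is_authorized(token: bytes, signature: bytes) -> bool:
    """Anyone holding the owner's public key can check the authorization."""
    try:
        owner_pub.verify(signature, token)
    except InvalidSignature:
        return False
    return time.time() < json.loads(token)["expires"]

token, sig = issue_token("cnn-classifier-v1")
assert is_authorized(token, sig)
# Revocation: the owner stops issuing tokens; previously issued tokens expire on their own.
```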
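For the third scheme, the client-side step is described as computing only the first network layer in plaintext and perturbing the result with differential privacy noise before upload; the remaining layers run in the cloud under fully homomorphic encryption. A minimal NumPy sketch of that client step is shown below; the assumption that the client holds the first-layer weights, as well as the layer shape, clipping bound, and Laplace mechanism parameters, are illustrative and not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng()

def client_first_layer(x: np.ndarray, W1: np.ndarray, b1: np.ndarray,
                       clip: float = 1.0, epsilon: float = 1.0) -> np.ndarray:
    """Compute the first layer in plaintext, then add Laplace noise for differential privacy."""
    h = np.maximum(W1 @ x + b1, 0.0)    # first layer + ReLU, computed on the device
    h = np.clip(h, -clip, clip)         # bound each coordinate so the sensitivity is 2*clip
    noise = rng.laplace(loc=0.0, scale=2 * clip / epsilon, size=h.shape)
    return h + noise                    # only this noisy activation leaves the device

# Toy example: a 4-dimensional input and 8 hidden units.
x = rng.random(4)
W1, b1 = rng.standard_normal((8, 4)), np.zeros(8)
upload = client_first_layer(x, W1, b1)
# The cloud evaluates the remaining layers on encrypted data; no encryption is done client-side.
```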
Keywords/Search Tags:Homomorphic Encryption, Digital Signature, Deep Learning, Cloud Computing, Differential Privacy