
Research On Optimization Of Differential Privacy Budgets In Graph Neural Networks

Posted on: 2023-09-19 | Degree: Master | Type: Thesis
Country: China | Candidate: W T Du | Full Text: PDF
GTID: 2568306902457274 | Subject: Cyberspace security
Abstract/Summary:
Graph neural networks have shown excellent performance in many analysis tasks on graph-structured data. In the real world, however, graph-structured data often originate from information collected about social groups and may involve sensitive personal information, and therefore carry a risk of privacy leakage. In recent years, research on graph neural networks with privacy-preserving properties has received increasing attention. Differential privacy is a highly regarded privacy-preserving technique, and a number of works have combined it with graph neural networks to protect sensitive node features or edge information in graph-structured data. However, privacy protection for graph-structured data must achieve not only data security but also the best possible data availability, i.e., the performance of the graph neural network models. When data are protected with differential privacy, the privacy budget directly controls the amount of noise added to the data and thus directly determines both the usability and the security of the data. Although graph neural network models trained under differential privacy are considered theoretically immune to privacy attacks, the connection between the privacy budget and the degree of privacy protection that graph-structured data possess in practice remains unclear, and no existing research provides a method for choosing optimal differential privacy budgets in graph neural networks.

We propose a research scheme for optimizing differential privacy budgets in graph neural networks, with the goal of establishing an intuitive connection between the privacy budget and the degree of privacy protection that graph-structured data possess in practice, and of providing a method for selecting optimal differential privacy budgets in graph neural networks. Specifically, we use the effect of attacks that are feasible in practice as the metric of the degree of privacy protection, and we investigate the optimization problem from three perspectives: node features, edges, and the robustness of the graph neural network model, aiming to establish an intuitive connection among the privacy budget, data availability, and the degree of privacy protection in practice.

Firstly, we propose an optimization scheme for differential privacy budgets at the node feature level in graph neural networks. The accuracy of a graph auto-encoder model on the edge prediction task is used to measure data availability, the effect of an attribute inference attack is used as the metric of the degree of privacy protection of node features, and the server is assumed to be an adversary that performs the attribute inference attack on the graph neural network model under different privacy budgets.
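As an illustration of how the budget controls the amount of noise at the node feature level, the following is a minimal sketch that perturbs node features with the standard Laplace mechanism. It assumes features are clipped to a hypothetical bound clip_bound so that the sensitivity is known; it illustrates the budget-noise relationship and is not the perturbation mechanism used in this thesis.

    import numpy as np

    def perturb_node_features(features, epsilon, clip_bound=1.0):
        """Laplace perturbation of node features under privacy budget epsilon.

        Minimal sketch: features are clipped to [-clip_bound, clip_bound], so the
        per-feature L1 sensitivity is 2 * clip_bound; a smaller budget epsilon gives
        a larger noise scale, i.e. stronger protection but lower data availability.
        """
        clipped = np.clip(features, -clip_bound, clip_bound)
        scale = 2.0 * clip_bound / epsilon   # Laplace scale b = sensitivity / epsilon
        noise = np.random.laplace(loc=0.0, scale=scale, size=clipped.shape)
        return clipped + noise

With clip_bound = 1.0, a budget of epsilon = 0.1 yields a noise scale of 20, while epsilon = 2.0 yields a scale of 1, which is why the choice of budget directly trades security against usability.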
Secondly, we propose an optimization scheme for differential privacy budgets at the edge level in graph neural networks. A node classification model based on a two-layer graph convolutional network is trained on generated graph-structured data, the accuracy of the model on the node classification task is used to measure data availability, and the effect of a link-stealing attack is used as an intuitive indicator of the degree of edge privacy protection of the graph-structured data. We perform the link-stealing attack on the node classification model under different privacy budgets.

Finally, we propose an optimization scheme for differential privacy budgets at the level of the robustness of the graph neural network model. We find that the differential privacy budget in graph neural networks affects not only data availability and the degree of privacy protection, but also the robustness of the graph neural network model. We train a graph classification model, use its accuracy on the graph classification task to measure data availability, use the effect of a backdoor attack on graph neural networks as an intuitive metric of the robustness of the model, and launch the backdoor attack on the graph classification model under different privacy budgets.

Experiments on publicly available graph-structured datasets show that our schemes provide a valuable solution to the privacy budget optimization problem in graph neural networks and are expected to be useful for the application of differential privacy in practice.
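To make the kind of budget selection described above concrete, the following sketch sweeps a set of candidate budgets and records the utility/attack trade-off. perturb_node_features is the function from the sketch above; train_model, eval_utility, and run_attack are hypothetical callbacks supplied by the caller (e.g. train a graph neural network, report task accuracy, report attack success), not components of this thesis.

    def sweep_privacy_budgets(features, train_model, eval_utility, run_attack,
                              budgets=(0.1, 0.5, 1.0, 2.0, 4.0, 8.0)):
        """Record data availability and attack success for each candidate budget.

        train_model(noisy_features) -> trained model        (hypothetical callback)
        eval_utility(model)         -> task accuracy        (data availability)
        run_attack(model)           -> attack success rate  (practical privacy metric)
        """
        results = []
        for epsilon in budgets:
            noisy = perturb_node_features(features, epsilon)  # defined in the sketch above
            model = train_model(noisy)
            results.append({"epsilon": epsilon,
                            "utility": eval_utility(model),
                            "attack": run_attack(model)})
        # A budget is a good choice when utility is acceptable and attack success
        # stays close to random guessing.
        return results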
Keywords/Search Tags:Graph Neural Networks, Differential Privacy Budgets, Backdoor Attack, Link Stealing Attack, Attribute Inference Attack