Differential Privacy Inference Attack Based On Location Density And Distance Features

Posted on: 2020-05-04    Degree: Master    Type: Thesis
Country: China    Candidate: R J Wang    Full Text: PDF
GTID: 2428330599958603    Subject: Computer technology
Abstract/Summary:
With the development of information technology and the spread of location-based mobile devices and services, mobile data containing personal information is growing explosively. On the one hand, such data can support commercial research and provide users with more accurate services. On the other hand, raw mobile data not only contains a large number of personal attributes but can also be mined for users' travel trajectories and social relationships, leading to privacy leakage. Differential privacy is widely used among existing privacy protection methods, yet there is little research on inference attacks against it. Inference attacks can reveal the scenarios in which differential privacy still discloses private information, which is of practical significance for improving differential privacy protection methods in the future.

Two observations motivate this work: differential privacy relies on a strong independence assumption, and data processed with differential privacy can still preserve a user's location distribution characteristics within a certain perturbation range. The notions of "real data" and "false data" after perturbation are therefore defined first: given a threshold, if the distance from a perturbed position to its original position is within the threshold, the position is regarded as real; otherwise it is false. Based on this definition, an attack scenario against differential privacy is designed, in which the attacker holds some real data as background knowledge and uses it to distinguish real records from false ones. On this basis, a differential privacy inference attack method based on location density and distance features (DPIA-LDDF) is proposed. Each perturbed record is treated as a sample, location density and distance features are extracted from multiple records, and each sample is labeled as positive or negative according to the authenticity of its data. The samples corresponding to the background knowledge form the training set, on which a decision-tree-based boosting model is trained. The remaining samples to be inferred are then fed into the model, the label of each sample is predicted, and the authenticity of the corresponding check-in record is obtained.

The attack method is designed and implemented, including an algorithm for determining the threshold that separates real and false data, a feature extraction algorithm, and a decision-tree-based ensemble algorithm. Experiments on multiple data sets verify the effectiveness of DPIA-LDDF.
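To make the pipeline described above concrete, the following is a minimal Python sketch. The abstract does not specify the exact density and distance features, the threshold value, the perturbation mechanism, or the boosting implementation, so the choices below (neighbour count within a radius, distances to the k nearest perturbed neighbours, Laplace-style noise, scikit-learn's GradientBoostingClassifier) are illustrative assumptions only, not the thesis's exact algorithms.

# Illustrative sketch of a DPIA-LDDF-style attack; feature definitions,
# threshold, and noise scale are assumptions for demonstration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier  # decision-tree boosting

def label_real_or_false(perturbed, original, threshold):
    # Label a perturbed point as real (1) if it lies within `threshold`
    # of its original point, otherwise false (0).
    dist = np.linalg.norm(perturbed - original, axis=1)
    return (dist <= threshold).astype(int)

def extract_features(perturbed, radius=0.05, k=5):
    # Assumed per-record features: number of perturbed neighbours within
    # `radius` (location density) and distances to the k nearest
    # perturbed neighbours (distance features).
    diffs = perturbed[:, None, :] - perturbed[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    np.fill_diagonal(dists, np.inf)          # ignore self-distances
    density = (dists <= radius).sum(axis=1)
    knn = np.sort(dists, axis=1)[:, :k]
    return np.column_stack([density, knn])

# Background knowledge: the attacker knows the original locations for a subset.
rng = np.random.default_rng(0)
orig_bk = rng.uniform(0, 1, size=(500, 2))
pert_bk = orig_bk + rng.laplace(scale=0.02, size=orig_bk.shape)  # DP-style noise

X_train = extract_features(pert_bk)
y_train = label_real_or_false(pert_bk, orig_bk, threshold=0.03)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Inference: predict the authenticity of the remaining perturbed check-ins,
# for which the attacker does not know the original positions.
pert_rest = rng.uniform(0, 1, size=(200, 2)) + rng.laplace(scale=0.02, size=(200, 2))
pred_labels = model.predict(extract_features(pert_rest))

The sketch follows the structure described in the abstract: label background-knowledge records as real or false by a distance threshold, extract location density and distance features per record, train a decision-tree boosting model on the labeled samples, and predict the authenticity of the remaining perturbed records.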
Keywords/Search Tags:mobile data privacy protection, differential privacy, inference attack, feature extraction, machine learning