
Research On Personalized Data Masking Protection Method In Data Sharing

Posted on: 2021-01-18
Degree: Master
Type: Thesis
Country: China
Candidate: J W Sun
Full Text: PDF
GTID: 2428330647955085
Subject: Communication and Information System
Abstract/Summary:
With the advent of the era of massive data, large amounts of personal data are collected and released every day. Publishing and sharing data have promoted the development of scientific research and brought convenience to people's lives, but they have also produced a great deal of private data. As privacy awareness improves, users worry that their private information may be leaked when they use network services, and if these data are not handled properly, they will inevitably cause privacy problems for individuals. Therefore, data desensitization (masking) technology is significant for the protection of private information.

Existing data masking techniques mainly include data distortion, data encryption, and data anonymization. Data anonymization not only protects users' information but also preserves the availability of the data. However, people have different requirements for the protection of their sensitive information and hope for a desensitization method that better matches their own needs. Therefore, this thesis studies personalized data desensitization protection technology in data sharing. The main work is as follows:

(1) This thesis proposes a ?_k personalized data masking protection (PDMP) method. It addresses two problems: direct masking leaves the published data with little utility, and basic anonymization does not fully desensitize sensitive attributes. First, the data set is processed with the k-anonymity model; second, the quasi-identifiers of sensitive records are processed with the ?_k normalization function within each equivalence group, and the data are masked and protected by the corresponding constraint methods. The results show that the ?_k personalized data masking protection method meets individual requirements for protecting sensitive data, reduces the sensitivity of the information, strengthens individual users' control over their sensitive values, and provides stronger privacy protection.

(2) This thesis proposes a ?_p probabilistic differential data masking protection (PDDM) method. To address the similarity attacks on sensitive attributes that affect the personalized (p,k)-anonymity model and the personalized (?,k)-anonymity model, the following improvements are made. First, the frequency of occurrence of each sensitive value is counted; second, the ?_p probability function is used to calculate the diversity, and Laplace noise is added to the probabilities of the sensitive attributes to change their frequencies, so that the sensitive attributes within an equivalence group change only slightly while a high degree of similarity among sensitive values is maintained. The results show that the ?_p probabilistic differential data masking protection method reduces the risk of inferring sensitive attributes from multiple quasi-identifiers, avoids homogeneity in the sensitive attributes, and effectively improves the availability of the private data.
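To make the two steps concrete, here is a minimal Python sketch of the idea behind method (1): masking sensitive values inside k-anonymous equivalence groups according to a per-user protection level. All identifiers (generalize, mask_value, the toy record layout, and the masking rule) are illustrative assumptions, not the thesis' actual algorithm.

```python
# Sketch of step (1): personalized masking inside k-anonymous equivalence groups.
# The generalization rule and masking rule below are toy assumptions.
from collections import defaultdict

def generalize(qi):
    """Coarsen quasi-identifiers, e.g. exact age -> 10-year band, zip -> prefix."""
    age, zipcode = qi
    return (age // 10 * 10, zipcode[:3] + "**")

def mask_value(value, level):
    """Mask a sensitive string more aggressively for higher protection levels."""
    if level <= 0:
        return value                      # user accepts publication as-is
    keep = max(1, len(value) - level)     # hide `level` trailing characters
    return value[:keep] + "*" * (len(value) - keep)

def personalized_masking(records, k=3):
    """records: list of (quasi_identifiers, sensitive_value, user_level)."""
    groups = defaultdict(list)
    for qi, sens, level in records:
        groups[generalize(qi)].append((sens, level))
    released = []
    for gqi, members in groups.items():
        if len(members) < k:              # suppress groups smaller than k
            continue
        for sens, level in members:
            released.append((gqi, mask_value(sens, level)))
    return released

print(personalized_masking([
    ((34, "10023"), "diabetes", 2),
    ((36, "10045"), "flu", 0),
    ((31, "10099"), "hepatitis", 3),
]))
```

For method (2), the abstract describes adding Laplace noise to the frequencies of the sensitive values. The sketch below shows that generic mechanism only, assuming sensitivity 1 for the counting query and an illustrative epsilon; it is not the thesis' ?_p probability function.

```python
# Sketch of step (2): Laplace-perturbed frequencies of sensitive values.
import random
from collections import Counter

def laplace_sample(scale):
    """Laplace(0, scale) sample as the difference of two exponential draws."""
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def noisy_frequencies(sensitive_values, epsilon=1.0):
    """Return Laplace-perturbed relative frequencies of each sensitive value.

    A counting query has sensitivity 1, so the noise scale is 1/epsilon;
    negative noisy counts are clipped to 0 before normalising.
    """
    counts = Counter(sensitive_values)
    noisy = {v: max(0.0, c + laplace_sample(1.0 / epsilon))
             for v, c in counts.items()}
    total = sum(noisy.values()) or 1.0
    return {v: c / total for v, c in noisy.items()}

print(noisy_frequencies(["flu", "flu", "diabetes", "hepatitis", "flu"], epsilon=0.5))
```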
Keywords/Search Tags:Data masking, Differential privacy, Individuation, Anonymity