
Research On Cloud Computing Forensic Model And Its Key Technology

Posted on: 2018-08-25    Degree: Master    Type: Thesis
Country: China    Candidate: Y Z Gao    Full Text: PDF
GTID: 2348330563451197    Subject: Computer Science and Technology
Abstract/Summary:
With the rapid development of cloud computing worldwide, it has become a new target of digital crime; external attacks and malicious insiders pose increasingly serious threats. Cloud computing forensics is one of the effective measures to combat and deter cloud crime. However, the inherent characteristics of cloud computing (such as large scale, virtualization, and distribution) produce a large amount of potential evidence and make precise, complete evidence location difficult. Current forensic models and existing methods for evidence collection and analysis are insufficient in the cloud environment. In this thesis, several key technologies of cloud forensics were studied in depth. The main work is as follows:

1. To guide the forensic process in the cloud computing environment and address the main challenges facing cloud forensics, cloud characteristics were analyzed and a cloud forensic model was proposed. In view of the volatility of data in the cloud, a forensic-readiness service for real-time backup of distributed cloud storage metadata and virtual machine management data was presented. Considering the large-scale, multi-layer nature of the cloud forensic environment and the diversity of attack modes, multi-angle evidence identification strategies, a multi-round iterative identification process, and a multi-level evidence location method for distributed file systems were presented to improve the completeness and accuracy of evidence identification. In view of the openness and sharing of cloud storage, "data isolation" and "on-demand collection" strategies were presented to prevent evidence destruction and avoid collecting large amounts of irrelevant data. To handle the large volume and diverse formats of data to be analyzed, a Hadoop framework and a comprehensive forensic tool library based on cloud resources were established. In addition, the role of a loss-party representative, combined with the principles of digital signatures, was put forward to protect evidence integrity and ensure the chain of custody. Finally, the validity of the model was analyzed against forensic scenarios in the cloud environment.

2. To address the difficulty of evidence identification and collection in distributed cloud storage, taking HDFS, a popular distributed file system, as a case study, an efficient file-extraction forensic method based on three-level (3L) mapping was proposed. By analyzing the overall structure and metadata features of HDFS and of the local file system on which it resides, HDFS was divided into four layers, from its namespace down to local storage space, and the 3L mapping from an HDFS file to its local data blocks was formally designed and established. By analyzing the metadata changes at each layer after an HDFS file is deleted, a file recovery method based on 3L mapping was presented. A multi-node HDFS architecture was set up on the Xen virtualization platform to test the method's performance. The results show that the method can precisely and completely locate the local block addresses of an HDFS file, enabling selective imaging of disk data and improving evidence collection efficiency. The recovery rate of deleted files (especially large files stored across data nodes) is far higher than that of DFF and Explorer when the files are partially overwritten.

3. To address big-data analysis and insider theft detection in distributed cloud storage, a stochastic forensic algorithm for HDFS data theft detection based on MapReduce was proposed. By analyzing the theoretical basis of the stochastic model of file system behavior and the HDFS timestamp features generated by folder replication, a detection and measurement method for replication behavior was established that can detect data theft launched by malicious insiders with legitimate authority. Considering the parallel data-processing features of MapReduce, a data set suitable for MapReduce task partitioning that preserves the HDFS hierarchy was designed, along with an algorithm implementation that takes files as data units and folders as detection units. Experimental results show that by adopting a segmented detection strategy according to the number of files in a folder and adjusting the detection threshold properly, the miss rate and the number of false detections can be kept low. The speedup of multiple nodes over a single node increases with data volume: when the data volume reaches 7.44 GB, eight nodes run 7.39 times faster than a single node, showing that the algorithm scales well.
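The folder-replication detection in point 3 rests on a simple timestamp observation: when an insider bulk-copies a folder, the access timestamps of the files inside it are rewritten in one tight burst, whereas normal use leaves them scattered over time. The following Python fragment is a minimal illustrative sketch of that idea in MapReduce style, with files as data units and folders as detection units; the function names, the 5-second window, and the minimum file count are assumptions for illustration, not the thesis's actual parameters or implementation.

```python
# Illustrative sketch (assumed names/parameters): flag folders whose file
# access timestamps cluster in a narrow window, a signature of bulk copying.
from collections import defaultdict

def map_phase(records):
    """Map step: records are (hdfs_path, access_ts) pairs; emit (folder, ts)."""
    for path, ts in records:
        folder = path.rsplit("/", 1)[0]  # file is the data unit
        yield folder, ts

def reduce_phase(pairs, window=5.0, min_files=3):
    """Reduce step: folder is the detection unit. A folder whose file access
    times all fall within `window` seconds looks like a replication burst."""
    by_folder = defaultdict(list)
    for folder, ts in pairs:
        by_folder[folder].append(ts)
    flagged = []
    for folder, stamps in by_folder.items():
        if len(stamps) >= min_files and max(stamps) - min(stamps) <= window:
            flagged.append(folder)
    return sorted(flagged)
```

In a real deployment the map and reduce steps would run as Hadoop tasks over HDFS audit metadata, and the window and file-count thresholds would correspond to the detection threshold the abstract says must be tuned to keep miss and false-detection rates low.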
Keywords/Search Tags:Cloud Computing Forensics, Forensic Model, HDFS, Evidence Collection, Three-Level Mapping, Data Theft Detection, MapReduce