Research On The Model Checking Algorithm Using Hadoop

Posted on: 2014-01-30
Degree: Master
Type: Thesis
Country: China
Candidate: G Wei
GTID: 2248330395998328
Subject: Computer software and theory

Abstract/Summary:
Computer hardware and software systems are widely used in business-critical and safety-critical areas, where a failure can cause immeasurable loss. As these systems become increasingly complex, research on their correctness and reliability has become a hotspot in computer science. Among the available verification techniques, model checking stands out as a simple and highly automated method and has attracted broad attention.

Model checking is an automatic, model-based, property-verification approach. The system under verification is modeled as a Kripke structure, a finite state system, or a finite state transition system, and the properties to be verified are expressed in temporal logic, including linear temporal logic (LTL) and computation tree logic (CTL). Model checking has been very fruitful in verifying hardware, communication protocols, and software. It checks whether the finite state model satisfies a temporal logic formula; when the model does not satisfy the formula, the model checker produces a counterexample that leads to the failure.

Model checking searches the state space exhaustively, and the state space grows exponentially with the number of variables and the number of parallel components of the system. Adding a single boolean variable to the model can double the size of the state space, and hence the cost of verifying a property. This exponential growth is known as the state explosion problem and has become the main bottleneck of model checking.

Many techniques exist to alleviate the state explosion problem. One of them is to store and search the state space of the system model in a distributed way. In this work, Hadoop, an open-source distributed data processing framework, is used to design and implement the model checking algorithm. Hadoop allows the distributed processing of large data sets across clusters of computers using simple programming models, so users can develop and run applications that process large amounts of data without expert knowledge of distributed technology. This paper introduces the Hadoop framework, including the Hadoop Distributed File System (HDFS) and the MapReduce programming model.

This paper then presents the design and implementation of a CTL model checking algorithm. Based on a study of model checking and an analysis of the MapReduce execution process, the paper describes a data representation of Kripke structures suitable for the MapReduce programming paradigm, and designs and implements a new CTL model checking algorithm on the MapReduce framework. Given a model and a CTL formula, the algorithm outputs all states of the model that satisfy the formula. Experiments show that the algorithm is feasible and effective. The Hadoop-based model checking algorithm thus provides a new approach to overcoming the state explosion problem.
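To make the MapReduce formulation more concrete, the following is a minimal illustrative sketch of how one CTL labeling step could be expressed as a single Hadoop job: labeling the states that satisfy EX phi, given the states already known to satisfy phi. The line-based input format ("EDGE src dst" for transitions, "SAT state" for states satisfying phi), the class names, and the output convention are assumptions made for this sketch only; they are not the data representation or the algorithm actually used in the thesis.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical one-round job labeling states that satisfy EX phi.
// Assumed input lines (not the thesis's actual representation):
//   "EDGE <src> <dst>"  -- a transition of the Kripke structure
//   "SAT <state>"       -- a state already known to satisfy phi
// A state s satisfies EX phi iff some successor of s satisfies phi.
public class ExPhiJob {

  public static class ExMapper extends Mapper<Object, Text, Text, Text> {
    @Override
    protected void map(Object key, Text value, Context ctx)
        throws IOException, InterruptedException {
      String[] parts = value.toString().trim().split("\\s+");
      if (parts[0].equals("EDGE")) {
        // Group each edge by its target, so predecessors meet the SAT mark.
        ctx.write(new Text(parts[2]), new Text("PRED " + parts[1]));
      } else if (parts[0].equals("SAT")) {
        ctx.write(new Text(parts[1]), new Text("SAT"));
      }
    }
  }

  public static class ExReducer extends Reducer<Text, Text, Text, Text> {
    @Override
    protected void reduce(Text state, Iterable<Text> values, Context ctx)
        throws IOException, InterruptedException {
      boolean satisfiesPhi = false;
      List<String> preds = new ArrayList<>();
      for (Text v : values) {
        String s = v.toString();
        if (s.equals("SAT")) {
          satisfiesPhi = true;
        } else {
          preds.add(s.substring("PRED ".length()));
        }
      }
      if (satisfiesPhi) {
        // Every predecessor of a phi-state satisfies EX phi.
        for (String p : preds) {
          ctx.write(new Text(p), new Text("SAT_EX_PHI"));
        }
      }
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "ctl-ex-phi");
    job.setJarByClass(ExPhiJob.class);
    job.setMapperClass(ExMapper.class);
    job.setReducerClass(ExReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Operators with fixpoint semantics, such as EU or EG, would require iterating jobs of this kind until the labeled set stops growing; the sketch above shows only a single round.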
Keywords/Search Tags: MapReduce, Model Checking, CTL