Entropy and software systems: Towards an information-theoretic foundation of software testing

Posted on: 2012-07-22
Degree: Ph.D.
Type: Dissertation
University: Washington State University
Candidate: Yang, Linmin
GTID: 1468390011466378
Subject: Computer Science
Abstract/Summary:
In this dissertation, we introduce an information-theoretic framework for software testing. We first propose a syntax-independent coverage criterion: we use Shannon entropy to measure the amount of uncertainty in a software system and show how that uncertainty decreases as the system is tested. The system under test is modeled as a random variable whose sample space consists of all possible behavior sets over a known interface, and the entropy of the system is the Shannon entropy of this random variable. Under our criterion, the coverage of a test set is the expected decrease in entropy, that is, the expected amount of information gained. Because the criterion is syntax-independent, we also study information-optimal software testing, in which a test set is selected to gain the most information. Furthermore, we introduce behavioral complexity, a novel information-theoretic complexity metric for labeled graphs (which can be interpreted as control flow graphs, design specifications, etc.). We study how behavioral complexity changes when graphs are composed and show that it can increase when units are composed sequentially through loops or in parallel through synchronization. These results can help software developers predict, from their designs, the semantic complexity of a software system to be built; they also suggest that integration testing remains necessary after units have been tested individually.
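The coverage criterion can be made concrete with a small numerical sketch. The Python below is illustrative only: the toy prior over four behavior sets, the test's outcome partition, and all names are assumptions, not taken from the dissertation. It computes the Shannon entropy of the system before testing and the expected entropy decrease (information gain) from running one test whose outcome rules out some candidate behavior sets.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy prior over four hypothesized behavior sets of the system under test
# (assumed for illustration; the dissertation's sample space is the set of
# all possible behavior sets over a known interface).
prior = {"B1": 0.4, "B2": 0.3, "B3": 0.2, "B4": 0.1}

# A test whose "pass" outcome is consistent with B1 and B2, and whose
# "fail" outcome is consistent with B3 and B4 (assumed partition).
outcomes = {"pass": ["B1", "B2"], "fail": ["B3", "B4"]}

def information_gain(prior, outcomes):
    """Expected entropy decrease from running the test: H(prior) minus
    the expected entropy of the posterior over behavior sets."""
    h_before = shannon_entropy(prior.values())
    h_after = 0.0
    for consistent in outcomes.values():
        p_outcome = sum(prior[b] for b in consistent)
        if p_outcome == 0:
            continue
        posterior = [prior[b] / p_outcome for b in consistent]
        h_after += p_outcome * shannon_entropy(posterior)
    return h_before - h_after

print(f"H before testing: {shannon_entropy(prior.values()):.3f} bits")
print(f"Expected information gained: {information_gain(prior, outcomes):.3f} bits")
```

On this toy prior the system carries about 1.85 bits of uncertainty, and the single test is expected to gain about 0.88 bits; under the abstract's criterion, an information-optimal test set would be one maximizing this expected gain.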
Keywords/Search Tags: Software, Testing, Information, System, Entropy