
Managing data quality in accounting information systems: A stochastic clearing system approach

Posted on: 1993-08-12    Degree: Ph.D    Type: Dissertation
University: The University of Tennessee    Candidate: Bowen, Paul Larry    Full Text: PDF
GTID: 1479390014497142    Subject: Business Administration
Abstract/Summary:
Empirical evidence indicates that the computerized information systems managers use to make operational, tactical, and strategic decisions contain data quality problems. Information economists have proven that, ceteris paribus, more accurate data increase the value of an information system. This dissertation examines the effects on information systems of (1) improving input control effectiveness and (2) increasing the frequency with which organizations identify, investigate, and correct data errors.

When errors in an accounting information system accumulate to the maximum allowable error level, known as the clearing level, they are identified, investigated, and corrected, i.e., the errors are cleared. The number of errors currently in the system is the current error level. This set of assumptions can be modeled as a Markov process with an embedded Markov chain. Each event affecting an information system is assumed to have a probability of being processed correctly that is independent of previous error states, i.e., the probability that an event is processed correctly depends only on the number of errors currently in the database.

The Markov model is used to prove four theorems that reflect commonly held assumptions. The first two theorems show that, if input control effectiveness remains constant, lowering the clearing level improves data accuracy but increases the frequency of clearings. Theorems 3 and 4 show that, for a given clearing level, improving input control effectiveness retards the accumulation of data errors and decreases the frequency of clearings.

The Markov model is also used to prove four additional theorems that reveal less obvious relationships. Theorem 5 reveals that, for a given clearing level, improving input control effectiveness increases the variability of the time between clearings.
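The clearing mechanism described above can be sketched as a small simulation. This is a minimal illustration, not the dissertation's actual model: the fixed per-event error probability p_error (i.e., the independence assumption behind Theorems 6-8), the function name, and the parameter values are all assumptions introduced here.

```python
import random

def simulate_clearing(p_error, clearing_level, n_events, seed=0):
    """Simulate a stochastic clearing system.

    Each processed event introduces a new error with probability
    p_error, assumed constant regardless of the current error level.
    When the error count reaches clearing_level, all errors are
    identified, investigated, and corrected (cleared).

    Returns (time-average error level, mean events between clearings).
    """
    rng = random.Random(seed)
    errors = 0
    error_time = 0      # running sum of error levels across events
    cycles = []         # events elapsed between successive clearings
    since_clear = 0
    for _ in range(n_events):
        since_clear += 1
        if rng.random() < p_error:
            errors += 1
        if errors >= clearing_level:    # clearing level reached: clear
            cycles.append(since_clear)
            since_clear = 0
            errors = 0
        error_time += errors
    avg_level = error_time / n_events
    mean_cycle = sum(cycles) / len(cycles) if cycles else float("inf")
    return avg_level, mean_cycle
```

Rerunning with a lower clearing_level (Theorems 1-2) lowers the average error level but shortens the mean cycle; rerunning with a smaller p_error (Theorems 3-4) slows error accumulation and lengthens the mean cycle.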
Theorem 6 shows that, if the probability of correctly processing an event is independent of the current error level, improving input control effectiveness without lowering the clearing level does not improve average data quality. Theorem 7 demonstrates that, if the probability of correctly processing an event is independent of the current error level, lowering the clearing level yields linear marginal decreases in the average proportion of errors. Theorem 8 states that, if the clearing level remains constant, improving input control effectiveness yields increasing marginal increases in the length of time between clearings.
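Under the same simplifying independence assumption (each event introduces an error with a fixed probability p), the quantities in Theorems 7 and 8 have simple closed forms in a toy version of the model, which can be checked numerically. The formulas below are illustrative consequences of that toy model, not results quoted from the dissertation.

```python
def avg_error_level(c):
    # Time-average error level when each level 0..c-1 is occupied
    # for the same expected time between clearings: (c - 1) / 2.
    return (c - 1) / 2

def mean_cycle_length(c, p):
    # Expected events between clearings: c errors must occur,
    # each arriving with probability p per event.
    return c / p

# Theorem 7 (linear marginal effect): each unit reduction in the
# clearing level c removes the same 0.5 from the average error level.
deltas = [avg_error_level(c) - avg_error_level(c - 1) for c in range(2, 7)]

# Theorem 8 (increasing marginal effect): equal reductions in the
# error probability p buy ever-larger increases in cycle length.
cycles = [mean_cycle_length(10, p) for p in (0.20, 0.15, 0.10, 0.05)]
gains = [b - a for a, b in zip(cycles, cycles[1:])]
```

Here deltas is constant (the linear marginal decrease of Theorem 7), while gains grows with each equal step down in p (the increasing marginal increase of Theorem 8).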
Keywords/Search Tags: Clearing, Improving input control effectiveness, Data, Information