
Research On Performance Testing And Analysis Of XML Query Engines

Posted on: 2009-09-19
Degree: Master
Type: Thesis
Country: China
Candidate: T Xu
Full Text: PDF
GTID: 2178360275471879
Subject: Computer software and theory
Abstract/Summary:
Testing is a vital phase for guaranteeing software quality in the modern software lifecycle. To meet the need for performance testing of XML query engines, and to act as a watchdog during the development and improvement of the DM database XQuery engine, a proper theory and methodology must be put forward, including use-case design, data analysis methods, an automated testing framework, and so on. Such a methodology makes it possible to judge the merits and shortcomings of various XML storage models and query algorithms, and serves both as a practical supplement to theoretical analysis and as a stepping stone toward quantitative analysis.

By introducing a template-based data generation method, specific features of the target document tree can be specified, and the queries in existing benchmarks can thereby be improved. By observing how different implementation strategies affect performance, more distinguishing use cases can be obtained. By concurrently executing a transaction mix consisting of a set of queries, each with a given proportion, a closer simulation of practical applications is established than isolated laboratory use cases can provide, which in turn yields a better prediction of query performance in real-world applications.

As tests grow more comprehensive, eyeballing becomes increasingly inadequate in the face of rapidly growing data. To handle this problem, six statistical measures are introduced, on the basis of which indices such as the medley relay speed and the scalability factor are defined. Converting masses of raw data into these indices gives a clearer view of a query engine's performance and its potential bottlenecks, and carefully designed graphs and plots yield clearer test reports. With proper use-case design and data analysis, the amount of data first increases and then decreases; through this process of negation of negation, a better understanding is obtained.

To organize and describe use cases efficiently and to support sustained testing, an automated testing framework needs to be introduced. The framework, written in a scripting language, adopts a plug-in architecture consisting of a kernel interpreter module and extension adapters. By taking advantage of the scripting language, platform-related details can be hidden and multi-platform operation can be achieved. By using a text console as the interaction interface and XML to describe a test plan, extensibility is achieved to the fullest extent: new engines can easily be connected, and existing engines can change their interfaces with only small modifications to the corresponding adapter. All the features of the test framework described above lay a solid foundation for further research.
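To make the transaction-mix idea above concrete, the following is a minimal Python sketch of running a weighted mix of queries from several concurrent clients and recording per-query latencies. The query names, their proportions, and the run_query stub are illustrative assumptions, not taken from the thesis.

    # Minimal sketch of a "transaction mix": run a weighted mix of queries
    # concurrently and record per-query latency. The query set, proportions,
    # and run_query() are hypothetical placeholders for a real engine client.
    import random
    import time
    from concurrent.futures import ThreadPoolExecutor

    QUERY_MIX = {            # query name -> proportion in the mix (assumed values)
        "Q1_lookup": 0.50,
        "Q7_join":   0.30,
        "Q14_text":  0.20,
    }

    def run_query(name):     # placeholder for a real XQuery engine call
        time.sleep(0.01)

    def one_transaction(results):
        name = random.choices(list(QUERY_MIX),
                              weights=list(QUERY_MIX.values()))[0]
        start = time.perf_counter()
        run_query(name)
        results.append((name, time.perf_counter() - start))

    def run_mix(n_transactions=1000, n_clients=8):
        results = []
        with ThreadPoolExecutor(max_workers=n_clients) as pool:
            for _ in range(n_transactions):
                pool.submit(one_transaction, results)
        return results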
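The raw-data-to-index step could look roughly like the sketch below. The thesis defines six specific statistical measures and indices such as the scalability factor; the formulas used here (common summary statistics, and a scalability factor taken as the ratio of time growth to document-size growth) are plausible stand-ins only, not the thesis's definitions.

    # Sketch of turning raw timings into summary indices. The measures and
    # the scalability-factor formula below are illustrative assumptions.
    import statistics

    def summary_measures(times):
        """Summary statistics over one query's execution times (seconds)."""
        return {
            "mean":   statistics.mean(times),
            "median": statistics.median(times),
            "stdev":  statistics.stdev(times) if len(times) > 1 else 0.0,
            "min":    min(times),
            "max":    max(times),
            "p95":    sorted(times)[int(0.95 * (len(times) - 1))],
        }

    def scalability_factor(times_by_size):
        """Assumed definition: ratio of median-time growth to data-size growth
        between the smallest and largest tested document sizes."""
        sizes = sorted(times_by_size)
        t_small = statistics.median(times_by_size[sizes[0]])
        t_large = statistics.median(times_by_size[sizes[-1]])
        return (t_large / t_small) / (sizes[-1] / sizes[0])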
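The plug-in framework could be sketched as a small kernel that reads an XML test plan and hands each case to an engine-specific adapter. The XML schema, the adapter interface, and the DMXQueryAdapter class below are hypothetical placeholders, not the thesis framework's actual design.

    # Sketch of the framework idea: an XML test plan is parsed by a small
    # kernel, which dispatches each case to an engine adapter (plug-in).
    import xml.etree.ElementTree as ET

    PLAN = """
    <testplan>
      <case engine="dm_xquery" query="doc('site.xml')//item[@id='42']" repeat="10"/>
      <case engine="other_engine" query="count(//person)" repeat="10"/>
    </testplan>
    """

    class EngineAdapter:                    # one adapter per query engine
        def execute(self, query):
            raise NotImplementedError

    class DMXQueryAdapter(EngineAdapter):   # hypothetical DM XQuery adapter
        def execute(self, query):
            print("DM engine runs:", query)

    ADAPTERS = {"dm_xquery": DMXQueryAdapter()}

    def run_plan(xml_text):
        for case in ET.fromstring(xml_text).findall("case"):
            adapter = ADAPTERS.get(case.get("engine"))
            if adapter is None:
                continue                    # engine not plugged in; skip
            for _ in range(int(case.get("repeat", "1"))):
                adapter.execute(case.get("query"))

    run_plan(PLAN)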
Keywords/Search Tags:database, automated test, benchmark, XML, query engine