
Multi-threaded Caching Problem

Posted on: 2012-03-06 | Degree: Doctor | Type: Dissertation
Country: China | Candidate: H F Xu | Full Text: PDF
GTID: 1220330395973491 | Subject: Operational Research and Cybernetics
Abstract/Summary:
In this work, we are interested in the problem of satisfying multiple concurrent requests submitted to a computing server. Informally, several users each send a sequence of tasks, linked by precedence constraints, to the server. A task may occur several times in the same sequence, as well as in the request sequence of another user. The server owns a cache of limited size in which intermediate results of the processing can be stored. Different tasks may require different processing times and occupy different amounts of space. When a task is served, if its result is already stored in the cache, no cost is incurred; otherwise, a processing cost is incurred and the result may be placed in the cache, possibly forcing other results out so that the total size of the cached results does not exceed the cache capacity.

The goal of this work is to determine a schedule of the tasks that minimizes certain objective functions. The problem generalizes classical Caching, in which only a single request sequence is considered. We then extend the study to minimizing the mean completion time of the request sequences. Two models are considered: in the first, caching is forced, whereas in the second, caching is optional and one can choose whether or not an intermediate result is stored in the cache. All combinations turn out to be NP-hard, even for fixed cache sizes, and we provide dynamic programming algorithms as well as inapproximability bounds. We propose polynomial-time approximation algorithms for some variants and analyze their performance ratios. Finally, we also devise some heuristics and present experimental results.

In fact, from the perspective of resource utilization, the cache itself is undoubtedly a resource. Thus, in addition to studying cache utilization, we also address a variant of the scheduling problem on two identical machines in which an additional speed-up resource is given. If a job uses the resource, its processing time may decrease; however, at any time the resource can be used by at most one job. The objective is to minimize the makespan. For the offline version, which is NP-hard, we present an FPTAS. For the online version, where jobs arrive over a list, we propose an online algorithm with a competitive ratio of 1.781 and show a lower bound of 1.686 for any online algorithm.
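As a concrete illustration of the caching cost model described above (a sketch, not taken from the thesis): the function below computes the processing cost of serving one interleaved request sequence under the forced-caching model. An arbitrary evict-largest rule stands in for the eviction decisions the optimization actually makes, and all names and example data are hypothetical.

```python
def serve(sequence, sizes, costs, capacity):
    """Return the total processing cost of serving `sequence` of task ids.

    sizes[t] -- cache space occupied by task t's result
    costs[t] -- processing cost paid whenever t is served uncached
    A cache hit is free; a miss pays costs[t] and forces t's result into
    the cache, evicting the largest cached results until it fits.
    """
    cache = set()   # task ids whose results are currently cached
    used = 0        # total size currently occupied in the cache
    total = 0
    for t in sequence:
        if t in cache:
            continue            # hit: serving is free
        total += costs[t]       # miss: pay the processing cost
        if sizes[t] > capacity:
            continue            # result larger than the whole cache
        while used + sizes[t] > capacity:
            victim = max(cache, key=lambda x: sizes[x])  # evict-largest rule
            cache.remove(victim)
            used -= sizes[victim]
        cache.add(t)            # forced caching: the result is stored
        used += sizes[t]
    return total

# Hypothetical instance: three distinct tasks, cache capacity 3.
sizes = {"a": 2, "b": 1, "c": 2}
costs = {"a": 5, "b": 1, "c": 3}
print(serve(["a", "b", "a", "c", "a"], sizes, costs, capacity=3))  # → 14
```

Note that the eviction rule is the whole game: here the second occurrence of "a" is a free hit, but serving "c" evicts "a" and the third occurrence of "a" pays again, which a better eviction choice could avoid.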
Keywords/Search Tags: Paging, Caching, Multi-threaded Caching, Complexity, Online Algorithms, Scheduling, Speeding-up Resource