
Understanding Shared Memory Dependences In Concurrent Programs

Posted on: 2018-03-02  Degree: Doctor  Type: Dissertation
Country: China  Candidate: Y Y Jiang  Full Text: PDF
GTID: 1368330545477609  Subject: Computer software and theory
Abstract/Summary:
Concurrent programming has been popular since the widespread adoption of multi-core processors and the increased support for concurrency in programming languages. However, the non-deterministic nature of concurrent programs brings new challenges: concurrency bugs are notoriously difficult to trigger, reproduce, and detect. Improving the quality of concurrent programs has therefore become a major challenge.

This dissertation focuses on the dynamic analysis of concurrent programs, in which a concurrent program's execution is observed and analyzed. Dynamic analysis is one of the most effective and efficient techniques for improving the quality of concurrent programs because it strikes a good balance between efficiency and completeness. Since dynamic analyses are based on execution traces, and concurrent programs have data and resource dependences across threads, the characteristics of such dependences must be understood in order to realize effective and efficient dynamic analyses of concurrent programs.

This dissertation particularly focuses on a problem common to all dynamic analyses of concurrent programs: obtaining shared memory dependences, i.e., the order of shared memory accesses. Since the non-determinism of a concurrent program is mainly attributed to non-deterministic accesses to shared resources (in particular, the shared memory), the dependences (orderings) between these accesses must be faithfully logged and analyzed. However, given the huge number of shared memory accesses, obtaining such dependences effectively and efficiently is a central problem that serves as the cornerstone of dynamic analyses of concurrent programs.

This dissertation studies three aspects of shared memory dependences: the research framework, techniques for obtaining shared memory dependences, and dynamic analyses based on shared memory dependences.

1. Observing that a systematic understanding of shared memory dependences was lacking, we first proposed a theoretical framework for modeling shared memory dependences and a technical framework for studying related work. The theoretical framework contains the formalization of concurrent systems, shared memory dependences, and their basic properties. It also includes a theory of locality, in which the characteristics of shared memory accesses in real-world concurrent program executions are studied. The technical framework, under which we surveyed existing work across several research domains, contains three key elements of shared memory dependences: criteria for measuring the effectiveness and efficiency of a technique that obtains them; two categories of approaches to obtaining them; and dynamic analysis techniques based on them.

2. Building on these two frameworks, we developed two novel techniques that effectively and efficiently capture shared memory dependences. RWTrace, an optimistic shared memory dependence tracing technique, exploits the thread locality of concurrent programs: it detects thread-local reads in O(1) time without synchronization, and by not logging such events it achieves a speedup of hundreds of times over a mutex-based implementation. Online shared memory dependence reduction via bisectional coordination exploits the spatial-thread locality of concurrent programs: it maintains an interval partition of the address space such that the variables in each interval exhibit spatial-thread locality, reducing the logged shared memory dependences by up to 97% at a runtime overhead of only 0-54.7% on top of the highly optimized RWTrace.

3. We also studied how to use shared memory dependences in software testing and debugging techniques. CARE, a deterministic replay technique, leverages a value-prediction cache to guide efficient record-and-replay of concurrent Java programs. Dynamic analyses (data race detection and false sharing detection) based on bisectional coordination greatly reduce runtime overhead and metadata maintenance costs. Software testing techniques based on reversing dependences found previously unknown bugs in popular open-source software.

Finally, summarizing past research, we observed that no existing work achieves precise shared memory dependence tracing using only wait-free changes to the program or the concurrent system. This dissertation therefore raises a negative conjecture: there is no free (wait-free modification) lunch (sequentially consistent trace). We provide a proof for a more restricted case of the VSC (verifying sequential consistency) problem, and use a similar technique to prove a special case of the no-free-lunch conjecture.
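The thread-locality idea behind the optimistic tracing described above can be illustrated with a minimal Python sketch. This is our own illustrative reconstruction under stated assumptions, not the dissertation's actual RWTrace implementation: names such as `TracedVar` and `owner` are hypothetical. Each variable tracks an owning thread; a read by the owner is recognized as thread-local on a fast path with no logging and no synchronization, while a cross-thread access takes a synchronized slow path that logs the dependence.

```python
import threading

class TracedVar:
    """Sketch of optimistic dependence tracing via thread locality.

    Illustrative only: the real technique works at the level of compiled
    code and shared memory, not Python objects.
    """

    def __init__(self, value):
        self.value = value
        self.owner = threading.get_ident()  # thread currently owning the variable
        self._lock = threading.Lock()       # used only on the slow path
        self.log = []                       # recorded cross-thread dependences

    def read(self):
        tid = threading.get_ident()
        if tid == self.owner:
            # Fast path: a thread-local read is detected in O(1) with no
            # synchronization, and no log entry is produced.
            return self.value
        # Slow path: a cross-thread read establishes a shared memory
        # dependence, which is logged under synchronization.
        with self._lock:
            self.log.append(("read", tid))
            return self.value

if __name__ == "__main__":
    v = TracedVar(42)
    v.read()                        # owner reads: nothing logged
    t = threading.Thread(target=v.read)
    t.start(); t.join()             # another thread reads: one dependence logged
    print(len(v.log))               # prints 1
```

The design point this sketch captures is that the common case (thread-local access) pays no synchronization cost at all; only the rare cross-thread transitions pay for logging, which is what makes the hundreds-of-times speedup over a fully mutex-protected log plausible.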
Keywords/Search Tags:Shared Memory Dependence, Concurrency, Concurrent Program, Dynamic Analysis