
Modeling And Optimization For Energy Cost Of Data Center In Cloud Computing

Posted on: 2016-08-22    Degree: Doctor    Type: Dissertation
Country: China    Candidate: S B Zhang    Full Text: PDF
GTID: 1228330470957961    Subject: Control theory and control engineering
Abstract/Summary:
Although the concept of cloud computing emerged only a few years ago, it has developed rapidly. Cloud computing has become a new paradigm for network-based services across many industrial and application domains, bringing disruptive innovation and showing great value and promise. With the continuing growth of online applications, services and data, the demand for computing, storage and network resources keeps increasing, and there is a clear trend toward large-scale data centers. As data centers are widely deployed and their scale grows, the large number of servers leads to a sharp increase in energy consumption; the energy cost of large data centers is becoming the fastest-growing element of their operating cost and increasingly restricts the development of many enterprises. Therefore, to improve their market competitiveness, it is particularly important for cloud service providers to keep data centers reliable and safe while minimizing energy costs and reducing operational costs. In recent years, efficient energy cost management of data centers has become an urgent and important research field that attracts attention from both academia and industry.

This paper considers the problem of energy cost control for large-scale data centers in cloud computing, gives a thorough analysis of the relevant energy saving mechanisms, and derives optimal control policies that achieve energy cost savings using learning theory as well as the large deviation principle. The main work of this paper is summarized as follows:

1) First, we investigated the problem of minimizing the electricity cost of data centers by using energy storage under the time-varying electricity prices of deregulated electricity markets, formulated as a discounted-cost Markov decision process. A dynamic energy storage control strategy based on the Q-Learning algorithm was designed to reduce the electricity cost, and the Speedy Q-Learning algorithm was also applied to accelerate convergence (a Q-Learning sketch is given after this summary). The proposed scheme makes decisions without any prior information about the energy management system of the data center and adapts to variations in the workload and the electricity prices. We also studied the offline optimization problem, formulated as a mixed-integer linear program (MILP), whose optimal solution can be regarded as a lower bound on the performance of the proposed algorithm. In the experiments, real workload traces and electricity price data sets were used to verify the proposed scheme, and the results illustrate its effectiveness in saving electricity cost compared with the benchmark algorithm. Results on real traces, which may not provably satisfy the Markovian assumption, also show that the proposed scheme generally performs well.

2) Second, we considered the problem of saving electricity cost subject to a constraint on queueing delay by tailoring the number of active servers to the workload and the electricity prices of deregulated electricity markets, formulated as a constrained-cost Markov decision process. Under various assumptions about the information available on the underlying stochastic processes, online, learning-theoretic and offline optimization approaches were studied.
A conventional learning-theoretic method such as Q-Learning makes decisions without any prior information about the Markov process, but it suffers from slow convergence. To address this, the workload distribution is estimated with a Gaussian mixture model (GMM), and a PDS (post-decision state) based learning algorithm is introduced to accelerate convergence (a GMM estimation sketch is given after this summary). We also studied the offline optimization problem, whose optimal solution can be regarded as a lower bound on the performance of the proposed schemes. Real workloads and electricity prices were used to verify the proposed schemes, and the performance evaluation demonstrates their effectiveness.

3) Finally, we introduced renewable energy, such as photovoltaic (PV) and wind generation (WG), into cloud data centers in order to achieve green data centers. We interpreted the problem of energy saving for the data center as tailoring the number of active servers to the workload and the renewable energy supply while providing QoS guarantees, formulated as a constrained optimization problem. An online measurement-based algorithm for adaptive server resource configuration was proposed, built on an overload probability estimation model (a sketch of such an overload test is given after this summary). The proposed algorithm makes decisions by monitoring the current workload and the renewable energy without any prior knowledge about them. Moreover, to obtain a smooth server resource configuration, the algorithm adjusts the number of active servers in iterative steps, which is useful for nonstationary workloads. Finally, we designed and implemented experiments based on real workloads from the Thunder and Intrepid clusters and renewable energy data from 50Hertz to verify the proposed algorithm. The experimental results show its performance guarantees and its adaptability to nonstationary workloads and renewable energy.
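The following is a minimal sketch of the tabular Q-Learning idea behind the energy storage controller of contribution 1. The state discretization (price level, battery level), the three discharge/idle/charge actions, the cost model, and all parameter values are illustrative assumptions rather than the dissertation's exact formulation, and the Speedy Q-Learning acceleration is omitted.

```python
import numpy as np

# Illustrative discretization (assumed, not the dissertation's exact model):
# state = (price level, battery level), action in {discharge, idle, charge}.
N_PRICE, N_BATTERY, N_ACTIONS = 10, 20, 3
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1     # learning rate, discount, exploration

Q = np.zeros((N_PRICE, N_BATTERY, N_ACTIONS))
rng = np.random.default_rng(0)

def step_cost(price, demand, action, battery):
    """Assumed cost model: the grid supplies the workload demand plus charging,
    minus whatever the battery discharges; cost = price * grid energy."""
    delta = (-1, 0, +1)[action]                               # battery units moved
    delta = max(-battery, min(delta, N_BATTERY - 1 - battery))
    grid_energy = max(demand + delta, 0.0)
    return price * grid_energy, battery + delta

def choose_action(price_lvl, battery_lvl):
    """Epsilon-greedy selection; the greedy action minimizes learned cost-to-go."""
    if rng.random() < EPSILON:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmin(Q[price_lvl, battery_lvl]))

def q_update(state, action, cost, next_state):
    """Standard Q-Learning update for a discounted-cost MDP."""
    target = cost + GAMMA * Q[next_state].min()
    Q[state][action] += ALPHA * (target - Q[state][action])

# Toy training loop on synthetic data; real workload and electricity price
# traces would be plugged in here instead.
price_lvl, battery_lvl = 0, N_BATTERY // 2
for _ in range(10_000):
    price = 20.0 + 5.0 * price_lvl                            # assumed price mapping
    demand = rng.uniform(0.5, 2.0)                            # assumed energy demand
    action = choose_action(price_lvl, battery_lvl)
    cost, next_battery = step_cost(price, demand, action, battery_lvl)
    next_price = int(rng.integers(N_PRICE))                   # placeholder price dynamics
    q_update((price_lvl, battery_lvl), action, cost, (next_price, next_battery))
    price_lvl, battery_lvl = next_price, next_battery
```

An epsilon-greedy policy over a discretized state space keeps the example model-free, in line with the abstract's point that no prior information about the energy management system is needed.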
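For contribution 2, the abstract states that the workload distribution is estimated with a GMM before the PDS-based learning algorithm is applied. Below is a minimal sketch of that estimation step using scikit-learn; the synthetic two-regime trace, the choice of two mixture components, and the variable names are assumptions made only for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical workload trace (requests per time slot); the dissertation uses
# real traces, so this synthetic two-regime series is only a placeholder.
rng = np.random.default_rng(1)
workload_trace = np.concatenate([
    rng.normal(200.0, 20.0, 5000),   # e.g. an off-peak regime
    rng.normal(500.0, 50.0, 5000),   # e.g. a peak regime
]).reshape(-1, 1)

# Fit a Gaussian mixture model to the observed workload; in practice the number
# of components would be selected, e.g. by BIC, rather than fixed at 2.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(workload_trace)

print("estimated means:  ", gmm.means_.ravel())
print("estimated weights:", gmm.weights_)

# The fitted mixture provides an explicit workload distribution that a
# post-decision-state learning scheme can sample from or evaluate densities on.
future_samples, _ = gmm.sample(10)                       # synthetic future workloads
log_pdf = gmm.score_samples(np.array([[300.0]]))         # log-density at workload 300
```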
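For contribution 3, the following sketches one plausible shape of an online, measurement-based server configuration rule driven by an overload probability estimate and an iterative adjustment step. The Gaussian tail approximation, the per-server capacity, the target overload probability, and the step size are all assumptions for illustration; the dissertation's estimator builds on the large deviation principle and also accounts for the available renewable energy, which this sketch omits.

```python
import math
from collections import deque

WINDOW = 300                    # recent workload samples kept (assumed)
CAPACITY_PER_SERVER = 100.0     # requests/s one active server handles (assumed)
TARGET_OVERLOAD = 1e-3          # QoS target on P(workload > capacity) (assumed)
STEP = 1                        # servers added/removed per adjustment

recent = deque(maxlen=WINDOW)

def overload_probability(n_servers):
    """Estimate P(workload > total capacity) from recent measurements, using a
    Gaussian tail approximation of the measured workload (an assumption here)."""
    if len(recent) < 2:
        return 1.0                                    # be conservative at start-up
    mu = sum(recent) / len(recent)
    var = sum((x - mu) ** 2 for x in recent) / (len(recent) - 1)
    sigma = max(math.sqrt(var), 1e-9)
    capacity = n_servers * CAPACITY_PER_SERVER
    return 0.5 * math.erfc((capacity - mu) / (sigma * math.sqrt(2.0)))

def adjust_servers(n_servers, measured_workload):
    """One control interval: record the new measurement, then move the number
    of active servers by at most STEP toward meeting the overload target."""
    recent.append(measured_workload)
    if overload_probability(n_servers) > TARGET_OVERLOAD:
        return n_servers + STEP                       # scale out: QoS at risk
    if overload_probability(n_servers - STEP) <= TARGET_OVERLOAD:
        return max(1, n_servers - STEP)               # scale in only if target still met
    return n_servers
```

Adjusting by a bounded step each interval yields the smooth, gradually changing configuration the abstract highlights as useful for nonstationary workloads.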
Keywords/Search Tags: Cloud Computing, Data Centers, Energy Cost Control, Renewable Energy, Green Data Center, Large Deviation Principle, Stochastic Optimization, Reinforcement Learning