
An Effective Fog Computing-based Caching Framework For Latency Minimization

Posted on: 2021-04-11
Degree: Master
Type: Thesis
Country: China
Candidate: Pitah Djahoue Kevan Aymeric
Full Text: PDF
GTID: 2428330611970457
Subject: Computer Science and Technology
Abstract/Summary:
With the rapid proliferation of new technologies, Cisco Systems predicted that around 50 billion smart sensors, Internet of Things (IoT) devices, and edge devices would be deployed and connected around the world by 2020. Given the rise of IoT and social media, the major traffic flows in future wireless networks will no longer be conventional phone traffic. Such increasing traffic volumes require better performance in terms of latency and therefore demand different approaches to resource utilization. This fast development also brings several new issues and challenges to the classic cloud computing environment, such as limited capacity, high latency, security concerns, and network delays; to address these issues, the fog computing paradigm has been introduced as a promising solution. The main idea of fog computing is to act as a decentralized extension of the cloud, forming a middle layer between the cloud and IoT end devices.

In this thesis we discuss and highlight the problem of high latency that a cloud-only approach brings, and we review the most relevant work related to our topic by other authors. To tackle the latency problem, our main contribution is a hierarchical fog computing-based caching framework that enables low-latency data transmission. It works by deploying multiple fog nodes geographically close to the users so that they provide low-cost, decentralized, and distributed caching storage with minimal computation and communication overhead, complementing the existing infrastructure and offloading network traffic. The system accepts requests from users and serves them locally whenever possible; otherwise the request is forwarded to the cloud. We use the iFogSim toolkit as a test-bed for our experiments.

Our initial findings show satisfying results: the proximity of the fog nodes to the on-premise end devices enables low-latency communication, resulting in much faster response and analysis times than the cloud-only approach, and the second experiment shows a reduced execution cost due to the more distributed architecture. These results can serve as a basis for future research on fog computing and can be used for Quality of Service (QoS) benchmarking of IoT applications. Finally, in the last chapter we draw conclusions and discuss the limitations of the system model as well as directions for future research.
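Although the thesis evaluates the framework in iFogSim, the core request-handling idea (serve from a nearby fog cache when possible, otherwise forward to the cloud) can be illustrated independently of that toolkit. The following is a minimal Java sketch under that assumption; the class FogNodeCache, the method handleRequest, and the placeholder fetchFromCloud are hypothetical names chosen for illustration and are not part of the iFogSim API or the thesis's actual implementation.

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a fog node's caching logic: keep a small LRU cache of recently
// requested content and fall back to the (higher-latency) cloud on a miss.
public class FogNodeCache {

    private final int capacity;
    private final Map<String, byte[]> cache;

    public FogNodeCache(int capacity) {
        this.capacity = capacity;
        // LinkedHashMap in access order gives a simple LRU eviction policy.
        this.cache = new LinkedHashMap<String, byte[]>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, byte[]> eldest) {
                return size() > FogNodeCache.this.capacity;
            }
        };
    }

    /** Serve a user request locally if cached; otherwise forward it to the cloud. */
    public byte[] handleRequest(String contentId) {
        byte[] data = cache.get(contentId);
        if (data != null) {
            return data;                  // fog hit: low-latency local response
        }
        data = fetchFromCloud(contentId); // fog miss: forward the request to the cloud
        cache.put(contentId, data);       // cache the result for nearby users
        return data;
    }

    // Placeholder for the cloud round trip; in the thesis this role is played
    // by the cloud tier simulated in iFogSim.
    private byte[] fetchFromCloud(String contentId) {
        return new byte[0];
    }
}

The sketch only captures the hierarchical fallback described above; node placement, cost accounting, and latency measurement are handled by the simulation setup in the experiments.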
Keywords/Search Tags: Internet of Things (IoT), Fog computing, latency, Quality of Service (QoS)