A Case for Elevating the Edge to be a Peer of the Cloud

Elevating the Edge to Be a Peer of the Cloud

2019 IEEE 12th International Conference on Cloud Computing (CLOUD)

Enabling next-generation technologies such as self-driving cars or smart cities requires us to rethink the way we support their applications. The emergence of these technologies is fueled by the proliferation of a large number of devices in the Internet of Things. These devices have the potential to generate massive amounts of data, and applications supporting them often require this data to be processed in a timely manner. Because of these requirements, we must augment and extend the Cloud computing model to better serve such applications. The backhaul links connecting clients to Cloud data centers could quickly become overwhelmed by such data, and the physical distance of these data centers from clients prevents low-latency response times. To meet the challenges posed by emerging IoT applications, we must provide Cloud-like functionality closer to the edge of the network, where clients and their data live. We propose to elevate the Edge to be a peer of the Cloud for addressing these challenges.

Mobile edge cloud architecture for future low-latency applications

2020

Abstract of the dissertation: Mobile Edge Cloud Architecture for Future Low-Latency Applications, by Sumit Maheshwari. Dissertation Director: Dipankar Raychaudhuri. This thesis presents the architecture, design, and evaluation of a mobile edge cloud (MEC) system aimed at supporting future low-latency applications. Mobile edge clouds have emerged as a solution for providing low-latency services in future generations (5G and beyond) of mobile networks, which are expected to support a variety of real-time applications such as AR/VR (Augmented/Virtual Reality), autonomous vehicles, and robotics. Conventional cloud computing, implemented at distant large-scale data centers, incurs irreducible propagation delays on the order of 50-100 ms or more; this may be acceptable for current applications but cannot support emerging real-time needs. The edge clouds considered here promise to meet the stringent latency requirements of emerging classes of real-time applications by bringing compute, storage, and...
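The 50-100 ms figure above is dominated by speed-of-light propagation plus routing and processing overheads. As a rough, back-of-the-envelope sketch (not taken from the dissertation), the Python snippet below estimates round-trip propagation delay from distance alone, assuming signals travel through fiber at roughly two-thirds of the speed of light; the example distances are hypothetical.

```python
# Back-of-the-envelope propagation delay estimate (illustrative only).
# Assumes signal speed in fiber is ~2/3 the speed of light and ignores
# routing detours, queuing, and processing delays.

SPEED_OF_LIGHT_KM_S = 300_000                      # approximate speed of light in vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3     # ~200,000 km/s in fiber

def one_way_delay_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds over fiber."""
    return distance_km / FIBER_SPEED_KM_S * 1000

# Hypothetical distances: a metro edge site vs. a distant regional data center.
for label, km in [("edge site (50 km)", 50), ("regional cloud (2000 km)", 2000)]:
    rtt = 2 * one_way_delay_ms(km)
    print(f"{label}: ~{rtt:.1f} ms round-trip propagation alone")
```

Even a 2,000 km path costs roughly 20 ms in propagation alone before routing detours, queuing, and processing are added, which is why a nearby edge site can shave tens of milliseconds off the response path.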

Why cloud applications are not ready for the edge (yet)

Proceedings of the 4th ACM/IEEE Symposium on Edge Computing, 2019

Mobile Edge Clouds (MECs) are distributed platforms in which distant data centers are complemented with computing and storage capacity located at the edge of the network. Their wide resource distribution enables MECs to fulfill the need for low latency and high bandwidth to offer an improved user experience. As modern cloud applications are increasingly architected as collections of small, independently deployable services, they can be flexibly deployed in various configurations that combine resources from both centralized data centers and edge locations. In principle, such applications should therefore be well placed to exploit the advantages of MECs so as to reduce service response times. In this paper, we quantify the benefits of deploying such cloud micro-service applications on MECs. Using two popular benchmarks, we show that, against conventional wisdom, end-to-end latency does not improve significantly even when most application services are deployed in the edge location. We developed a profiler to better understand this phenomenon, allowing us to develop recommendations for adapting applications to MECs. Further, by quantifying the gains of those recommendations, we show that the performance of an application can be made to reach the ideal scenario, in which the latency between an edge data center and a remote data center has no impact on the application performance. This work thus presents ways of adapting cloud-native applications to take advantage of MECs and provides guidance for developing MEC-native applications. We believe that both these elements are necessary to drive MEC adoption. CCS Concepts: • Networks → Network measurement; • Computer systems organization → Cloud computing; • Software and its engineering → Software design engineering.
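One way to see why moving most services to the edge may not help, as the abstract reports, is to model a synchronous micro-service call chain in which every edge/cloud boundary crossing pays the full inter-site round trip. The sketch below is a minimal illustration under assumed RTT values; the service chain and numbers are hypothetical and are not the paper's benchmarks or profiler.

```python
# Minimal sketch of why end-to-end latency may not improve when most
# micro-services move to the edge: every call that crosses the edge/cloud
# boundary still pays the full edge-to-cloud round trip. The placements and
# RTT values below are hypothetical, not taken from the paper's benchmarks.

EDGE_CLOUD_RTT_MS = 40.0   # assumed edge <-> cloud round trip
LOCAL_RTT_MS = 1.0         # assumed round trip within one site

def chain_latency(placements: list[str], client_site: str = "edge") -> float:
    """Sum per-hop round trips along a synchronous call chain.

    placements[i] is the site ("edge" or "cloud") hosting service i; the
    client sits at client_site, calls service 0, which calls service 1, etc.
    """
    total, current = 0.0, client_site
    for site in placements:
        total += LOCAL_RTT_MS if site == current else EDGE_CLOUD_RTT_MS
        current = site
    return total

# A mostly-edge placement with one cloud-bound dependency in the middle
# crosses the boundary twice and ends up slower than an all-cloud chain
# that crosses it only once.
print(chain_latency(["edge", "cloud", "edge", "edge"]))    # ~82 ms
print(chain_latency(["cloud", "cloud", "cloud", "cloud"]))  # ~43 ms
```

In this toy model, chain topology rather than placement counts drives end-to-end latency, which is consistent with the abstract's finding that latency need not improve even when most services sit at the edge.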

Impact of Mobile Edge Computing in Real-World

Today's world is seeing increasing use of mobile and sensor-rich devices such as smartphones, tablets, and wearables like smart watches. The volume of data generated by these devices is huge. Centralized cloud computing architectures cannot address the problems of network latency and jitter, degrading QoS (Quality of Service) and QoE (Quality of Experience), and other challenges faced by mobile users. In this paper, we survey the real-world impact of Mobile Edge Computing, an emerging edge computing technology that extends the cloud computing paradigm beyond the centralized architecture towards the edge of the network, nearer to the devices.

IJERT-An Overview of Edge Computing

International Journal of Engineering Research and Technology (IJERT), 2019

https://www.ijert.org/an-overview-of-edge-computing
https://www.ijert.org/research/an-overview-of-edge-computing-IJERTCONV7IS05016.pdf
Cloud computing has revolutionized how people store and use their data. However, there are areas where the cloud is limited: latency, bandwidth, security, and a lack of offline access can be problematic. To solve this problem, users need a robust, secure, and intelligent on-premise edge computing infrastructure. When data is physically located closer to the users who connect to it, information can be shared quickly, securely, and with low latency. In financial services, gaming, health care, and retail, low latency is vital for a good digital experience. To improve reliability and response times, the cloud should be combined with edge infrastructure.

Towards Enabling Novel Edge-Enabled Applications

ArXiv, 2018

Edge computing has emerged as a distributed computing paradigm to overcome practical scalability limits of cloud computing. The main principle of edge computing is to leverage computational resources outside of the cloud to perform computations closer to data sources, avoiding unnecessary data transfers to the cloud and enabling faster responses for clients. While this paradigm has been successfully employed to improve response times in some contexts, mostly by having clients perform pre-processing and/or filtering of data, or by leveraging distributed caching infrastructures, we argue that the combination of edge and cloud computing has the potential to enable novel applications. However, to do so, some significant research challenges have to be tackled by the computer science community. In this paper, we discuss different edge resources and their potential use, motivated by envisioned use cases. We then discuss concrete research challenges that once overcome, will allow ...

COMBAT: mobile-Cloud-based cOmpute/coMmunications infrastructure for BATtlefield applications

The amount of data processed annually over the Internet has crossed the zettabyte boundary, yet this Big Data cannot be efficiently processed or stored using today's mobile devices. Parallel to this explosive growth in data, a substantial increase in mobile compute capability and the advances in cloud computing have brought the state-of-the-art in mobile-cloud computing to an inflection point, where the right architecture may allow mobile devices to run applications utilizing Big Data and intensive computing. In this paper, we propose the MObile Cloud-based Hybrid Architecture (MOCHA), which formulates a solution to permit mobile-cloud computing applications such as object recognition in the battlefield by introducing a mid-stage compute- and storage-layer, called the cloudlet. MOCHA is built on the key observation that many mobile-cloud applications have the following characteristics: 1) they are compute-intensive, requiring the compute power of a supercomputer, and 2) they use Big Data, requiring a communications link to cloud-based database sources in near-real-time. In this paper, we describe the operation of MOCHA in battlefield applications, by placing the aforementioned mobile device and cloudlet within a soldier's vest and inside a military vehicle, respectively, and enabling access to the cloud through high-latency satellite links. We provide simulations using the traditional mobile-cloud approach as well as utilizing MOCHA with a mid-stage cloudlet to quantify the utility of this architecture. We show that the MOCHA platform for mobile-cloud computing promises a future for critical battlefield applications that access Big Data, which is currently not possible using existing technology.
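To make the mobile / cloudlet / cloud trade-off concrete, the sketch below compares response times for the three offloading paths under assumed link latencies and processing times; the numbers and names are illustrative placeholders, not results from MOCHA's simulations.

```python
# Illustrative comparison of offloading paths in a mobile/cloudlet/cloud
# hierarchy like the one MOCHA describes. All values are assumptions made
# for this example (satellite RTT, processing times), not measurements
# from the paper.

SATELLITE_RTT_MS = 600.0      # assumed high-latency satellite link to the cloud
CLOUDLET_RTT_MS = 10.0        # assumed link to a vehicle-mounted cloudlet
MOBILE_COMPUTE_MS = 900.0     # assumed on-device processing time per request
CLOUDLET_COMPUTE_MS = 120.0   # assumed cloudlet processing time per request
CLOUD_COMPUTE_MS = 30.0       # assumed data-center processing time per request

def response_time(link_rtt_ms: float, compute_ms: float) -> float:
    """Response time = network round trip + remote processing."""
    return link_rtt_ms + compute_ms

options = {
    "mobile only": response_time(0.0, MOBILE_COMPUTE_MS),
    "cloudlet": response_time(CLOUDLET_RTT_MS, CLOUDLET_COMPUTE_MS),
    "cloud (satellite)": response_time(SATELLITE_RTT_MS, CLOUD_COMPUTE_MS),
}
for name, ms in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name}: ~{ms:.0f} ms")
```

Under these assumptions the cloudlet path wins because it avoids both the slow on-device computation and the satellite round trip, which is the intuition behind inserting a mid-stage compute layer.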

A Comprehensive Review on Edge Computing

International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2023

The edge computing paradigm has experienced significant growth in both academic and professional circles in recent years. By bringing cloud computing resources and services closer to end users, it acts as a crucial enabler for several emerging technologies, including 5G, the Internet of Things (IoT), augmented reality, and vehicle-to-vehicle communications. Applications that require low latency, mobility, and location awareness are supported by the edge computing paradigm. Significant research has been done in the field of edge computing, which is examined in terms of recent advances like mobile edge computing, cloudlets, and fog computing; this has allowed academics to gain a deeper understanding of both current solutions and potential future applications. This article aims to provide a thorough overview of current developments in edge computing while emphasising the key applications. It also examines the significance of edge computing in real-world situations where response time is a crucial requirement for many applications. The prerequisites and open research issues in edge computing are discussed in the article's conclusion.

Edge-as-a-Service: Towards Distributed Cloud Architectures

2017

We present an Edge-as-a-Service (EaaS) platform for realising distributed cloud architectures and integrating the edge of the network into the computing ecosystem. The EaaS platform is underpinned by (i) a lightweight discovery protocol that identifies edge nodes and makes them publicly accessible in a computing environment, and (ii) a scalable resource provisioning mechanism for offloading workloads from the cloud onto the edge for servicing multiple user requests. We validate the feasibility of EaaS on an online game use-case to highlight the improvement in the QoS of the application hosted on our cloud-edge platform. On this platform we demonstrate (i) low overheads of less than 6%, (ii) data traffic to the cloud reduced by up to 95%, and (iii) application latency reduced by 40%-60%.
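The abstract does not spell out the discovery protocol, so the following is only a generic sketch of how a lightweight edge-node discovery exchange might look: a coordinator broadcasts a probe on the local network and edge nodes answer with their advertised capabilities. The message format, port number, and function names are assumptions, not the EaaS implementation.

```python
# Generic sketch of a lightweight edge-node discovery exchange over UDP
# broadcast. Port, message format, and capability fields are assumed for
# illustration; this is not the EaaS protocol itself.

import json
import socket

DISCOVERY_PORT = 50000          # assumed port
DISCOVERY_MSG = b"EAAS_DISCOVER"

def announce_edge_node(capabilities: dict) -> None:
    """Run on an edge node: answer discovery broadcasts with capabilities."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", DISCOVERY_PORT))
        while True:
            data, addr = sock.recvfrom(1024)
            if data == DISCOVERY_MSG:
                sock.sendto(json.dumps(capabilities).encode(), addr)

def discover_edge_nodes(timeout_s: float = 2.0) -> list[dict]:
    """Run on the coordinator: broadcast a probe and collect replies."""
    nodes = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.settimeout(timeout_s)
        sock.sendto(DISCOVERY_MSG, ("<broadcast>", DISCOVERY_PORT))
        try:
            while True:
                data, addr = sock.recvfrom(4096)
                nodes.append({"addr": addr[0], **json.loads(data)})
        except socket.timeout:
            pass
    return nodes
```

A provisioning layer could then rank the returned nodes by advertised capacity before offloading workloads to them, in the spirit of the scalable provisioning mechanism the abstract describes.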