Bridging the Divide: Unleashing the Potential of the Edge-To-Cloud Continuum in Real-Time Applications

Edge of Things: The Big Picture on the Integration of Edge, IoT and the Cloud in a Distributed Computing Environment

In current highly virtualized platforms of wireless networks and Internet of Things (IoT) applications, data analytics and decision-making are carried out by a centralized infrastructure. These existing methods are likely to encounter growing challenges from network dynamics, incurring high overhead in network response time and leading to latency and traffic congestion. To avoid these problems and achieve optimal resource utilization, a new paradigm called edge computing (EC) has been proposed to pave the way for a new generation of applications and services. With EC, processing capabilities are pushed to edge devices such as smartphones, sensor nodes, wearables, and on-board units, where data analytics and knowledge generation are performed, removing the need for a centralized system. Many IoT applications, such as smart cities, the smart grid, smart traffic lights, and smart vehicles, are rapidly upgrading to EC, significantly improving response time and conserving network resources. Although EC shifts the workload from a centralized cloud to the edge, the comparison between EC and the cloud with respect to factors such as resource management and computation optimization is still open to research. Hence, this paper aims to validate the efficiency and resourcefulness of EC. We extensively survey edge systems and present a comparative study with cloud computing systems. After analyzing the different network properties of each system, the results show that EC systems perform better than cloud computing systems. Finally, the research challenges in implementing an EC system and future research directions are discussed.
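The claimed response-time advantage of edge placement can be illustrated with a toy latency model. This sketch is entirely illustrative: the helper function, the numbers, and the assumption that edge servers are closer but slower are mine, not taken from the paper.

```python
# Toy model (illustrative assumptions, not the paper's methodology):
# response time = round-trip propagation + payload transfer + processing.

def response_time_ms(rtt_ms: float, service_ms: float,
                     payload_kb: float, bandwidth_mbps: float) -> float:
    """Total request latency under a single-server deployment."""
    transfer_ms = payload_kb * 8 / bandwidth_mbps  # KB over Mbps -> ms
    return rtt_ms + transfer_ms + service_ms

# Assumed numbers: a distant cloud (80 ms RTT, fast CPU) versus a nearby
# edge node (5 ms RTT, slower CPU), same 500 KB payload and 100 Mbps link.
cloud = response_time_ms(rtt_ms=80, service_ms=5, payload_kb=500, bandwidth_mbps=100)
edge = response_time_ms(rtt_ms=5, service_ms=15, payload_kb=500, bandwidth_mbps=100)
print(f"cloud: {cloud:.1f} ms, edge: {edge:.1f} ms")
```

With these assumed numbers the shorter round trip dominates, so the edge request completes sooner even though the edge processor is slower, which is the intuition behind the survey's comparison.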

Why cloud applications are not ready for the edge (yet)

Proceedings of the 4th ACM/IEEE Symposium on Edge Computing, 2019

Mobile Edge Clouds (MECs) are distributed platforms in which distant data-centers are complemented with computing and storage capacity located at the edge of the network. Their wide resource distribution enables MECs to fulfill the need for low latency and high bandwidth to offer an improved user experience. As modern cloud applications are increasingly architected as collections of small, independently deployable services, they can be flexibly deployed in various configurations that combine resources from both centralized datacenters and edge locations. In principle, such applications should therefore be well-placed to exploit the advantages of MECs so as to reduce service response times. In this paper, we quantify the benefits of deploying such cloud micro-service applications on MECs. Using two popular benchmarks, we show that, against conventional wisdom, end-to-end latency does not improve significantly even when most application services are deployed in the edge location. We developed a profiler to better understand this phenomenon, allowing us to develop recommendations for adapting applications to MECs. Further, by quantifying the gains of those recommendations, we show that the performance of an application can be made to reach the ideal scenario, in which the latency between an edge datacenter and a remote datacenter has no impact on the application performance. This work thus presents ways of adapting cloud-native applications to take advantage of MECs and provides guidance for developing MEC-native applications. We believe that both these elements are necessary to drive MEC adoption. CCS Concepts: • Networks → Network measurement; • Computer systems organization → Cloud computing; • Software and its engineering → Software design engineering.
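The paper's counter-intuitive finding, that placing most services at the edge does not help, can be sketched with a simple call-chain model. This is a hypothetical model under assumed RTTs, not the paper's profiler: every hop between an edge-resident service and a cloud-resident one pays the full edge-to-cloud round trip, so a few remote services can cost more than keeping the whole chain in the cloud.

```python
# Hypothetical model with assumed numbers, not the paper's measurements:
# request-path latency of a sequential micro-service chain split across sites.
EDGE_CLOUD_RTT_MS = 40.0  # assumed RTT between the edge DC and the remote DC
LOCAL_RTT_MS = 0.5        # assumed intra-datacenter RTT

def chain_latency_ms(placement, service_ms=2.0):
    """The client sits at the edge and each service calls the next in line;
    every cross-site hop pays the edge-cloud RTT, local hops a small one."""
    chain = ["edge"] + list(placement)   # prepend the client's location
    total = service_ms * len(placement)  # per-service processing time
    for a, b in zip(chain, chain[1:]):
        total += EDGE_CLOUD_RTT_MS if a != b else LOCAL_RTT_MS
    return total

all_cloud = chain_latency_ms(["cloud"] * 6)  # one boundary crossing
mostly_edge = chain_latency_ms(["edge", "edge", "cloud", "edge", "cloud", "edge"])
all_edge = chain_latency_ms(["edge"] * 6)    # no boundary crossings
print(f"all cloud: {all_cloud} ms, mostly edge: {mostly_edge} ms, all edge: {all_edge} ms")
```

Under these assumptions a "mostly edge" placement that leaves two services in the cloud is slower than an all-cloud deployment, because the chain crosses the edge-cloud boundary four times; only a crossing-free placement approaches the ideal scenario the paper describes.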

Challenges and Opportunities in Edge Computing

Many cloud-based applications employ a data centre as a central server to process data that is generated by edge devices, such as smartphones, tablets and wearables. This model places ever increasing demands on communication and computational infrastructure with inevitable adverse effect on Quality-of-Service and Experience. The concept of Edge Computing is predicated on moving some of this computational load towards the edge of the network to harness computational capabilities that are currently untapped in edge nodes, such as base stations, routers and switches. This position paper considers the challenges and opportunities that arise out of this new direction in the computing landscape.

Integration of edge computing with cloud computing

2017 International Conference on Emerging Trends in Computing and Communication Technologies (ICETCCT)

In this paper, we integrate edge computing with the cloud computing paradigm. Several issues and challenges in cloud computing can be resolved by edge computing, and we discuss how. We first introduce edge computing with its definition and data-flow diagram, followed by its applications in various fields such as smart cities, smart homes, cloud offloading, and image and video analysis. We then discuss the issues and challenges of cloud computing and show how edge computing can address them.

From Cloud to Edge: Empowering Intelligent Applications with Cloud-Native Technologies

International Journal of Science, Engineering and Technology, 2024

Edge computing's potential for real-time, data-driven applications is often hindered by traditional architectures' limitations in scalability and resource management. Cloud-native edge computing, integrating cloud technologies like containerization and orchestration, offers a solution. This paper explores cloud-native edge computing's architectural patterns, key technologies, benefits, and challenges. We highlight advantages like improved scalability and resource efficiency while addressing security and heterogeneity concerns. Real-world use cases across various industries demonstrate its transformative impact. The paper concludes by outlining future research opportunities in this rapidly evolving field.

Edge Computing Technology Enablers: A Systematic Lecture Study

IEEE Access

With the increasingly stringent QoS constraints (e.g., latency, bandwidth, jitter) imposed by novel applications (e.g., e-Health, autonomous vehicles, smart cities), as well as the rapidly growing number of connected IoT (Internet of Things) devices, the core network is becoming increasingly congested. To cope with these constraints, Edge Computing (EC) is emerging as an innovative computing paradigm that leverages Cloud computing and brings it closer to the customer. "EC" refers to transferring computing power and intelligence from the central Cloud to the network's Edge. With that, EC promotes the idea of processing and caching data at the Edge, thus reducing network congestion and latency. This paper presents a detailed, thorough, and well-structured assessment of Edge Computing and its enabling technologies. We start by defining EC from the ground up, outlining its architectures and evolution from Cloudlets to Multi-Access Edge Computing. Next, we survey recent studies on the main cornerstones of an EC system, including resource management, computation offloading, data management, and network management. We then emphasize EC technology enablers, starting with Edge Intelligence, the branch of Artificial Intelligence (AI) that integrates AI models at resource-constrained edge nodes with significant heterogeneity and mobility. Moving on to 5G and its empowering technologies, we explore how EC and 5G complement each other. After that, we study virtualization and containerization as promising hosting runtimes for edge applications. Further, we delineate a variety of EC use-case scenarios, e.g., smart cities, e-Health, and military applications. Finally, we conclude our survey by highlighting the role of EC integration with future concerns regarding green energy and standardization.

Index Terms: Edge computing, cloud computing, fog computing, multi-access edge computing, edge intelligence, 5G, containerization.
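One of the cornerstones this survey covers, computation offloading, is commonly framed as a completion-time comparison. The sketch below uses a generic textbook model of that decision; the CPU cycle counts, clock rates, and uplink rates are illustrative assumptions, not values from the survey.

```python
# Generic offloading model (illustrative, not from the survey): offload a task
# when shipping its input and computing at the edge beats computing locally.

def local_time_s(cycles: float, device_hz: float) -> float:
    """Completion time if the device executes the task itself."""
    return cycles / device_hz

def offload_time_s(cycles: float, edge_hz: float,
                   input_bits: float, uplink_bps: float) -> float:
    """Upload time plus execution time on the (faster) edge server."""
    return input_bits / uplink_bps + cycles / edge_hz

def should_offload(cycles, device_hz, edge_hz, input_bits, uplink_bps):
    return (offload_time_s(cycles, edge_hz, input_bits, uplink_bps)
            < local_time_s(cycles, device_hz))

# A compute-heavy task over a decent uplink is worth offloading with
# these assumed numbers...
print(should_offload(1e9, 1e9, 1e10, 1e6, 1e7))  # 0.2 s offloaded vs 1 s local
# ...but a 10x slower uplink erases the edge server's speed advantage.
print(should_offload(1e9, 1e9, 1e10, 1e6, 1e6))  # 1.1 s offloaded vs 1 s local
```

Real offloading schemes surveyed in this space also weigh energy, result download time, and server load; this two-term time model is only the simplest instance of the trade-off.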

IJERT-An Overview of Edge Computing

International Journal of Engineering Research and Technology (IJERT), 2019

https://www.ijert.org/an-overview-of-edge-computing
https://www.ijert.org/research/an-overview-of-edge-computing-IJERTCONV7IS05016.pdf

Cloud Computing has revolutionized how people store and use their data. However, there are some areas where the cloud is limited: latency, bandwidth, security, and a lack of offline access can be problematic. To solve this problem, users need robust, secure, and intelligent on-premise edge computing infrastructure. When data is physically located closer to the users who connect to it, information can be shared quickly, securely, and with low latency. In financial services, gaming, health care, and retail, low latency is vital for a great digital experience. Combining the cloud with edge infrastructure improves reliability and response time.

Quality of Service Aware Orchestration for Cloud–Edge Continuum Applications

Sensors

The fast growth in the number of connected devices with computing capabilities in recent years has enabled the emergence of a new computing layer at the Edge. Despite being resource-constrained compared with cloud servers, these devices offer lower latencies than those achievable by Cloud computing. The combination of the Cloud and Edge computing paradigms can provide a suitable infrastructure for complex applications' quality of service requirements that cannot easily be achieved with either paradigm alone. These requirements can be very different for each application, from achieving time sensitivity or assuring data privacy to storing and processing large amounts of data. Therefore, orchestrating these applications across the Cloud–Edge continuum raises new challenges that need to be solved in order to take full advantage of this layered infrastructure. This paper proposes an architecture that enables the dynamic orchestration of applications in the Cloud–Edge continuum. It fo...
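The orchestration problem this paper addresses can be sketched as a QoS-driven placement decision. The greedy policy below is a minimal hypothetical example, not the paper's architecture; the site latencies, capacities, and service names are invented for illustration.

```python
# Hypothetical greedy placement (not the paper's orchestrator); all sites,
# capacities, latencies, and service names below are invented examples.
SITES = {
    "edge": {"latency_ms": 5, "capacity": 2},     # scarce, close to users
    "cloud": {"latency_ms": 80, "capacity": 100},  # plentiful, far away
}

def place(services):
    """Map each service (name -> max tolerated latency, ms) to a site.
    The most latency-sensitive services are placed first; each goes to the
    first site that meets its requirement and still has capacity."""
    free = {name: site["capacity"] for name, site in SITES.items()}
    placement = {}
    for name, max_latency in sorted(services.items(), key=lambda kv: kv[1]):
        site = next((s for s in ("edge", "cloud")
                     if SITES[s]["latency_ms"] <= max_latency and free[s] > 0),
                    "cloud")  # fall back to the cloud if nothing qualifies
        free[site] -= 1
        placement[name] = site
    return placement

demo = place({"video-analytics": 10, "control-loop": 20,
              "batch-etl": 500, "dashboard": 200})
print(demo)
```

With the assumed two-slot edge site, the two tightest deadlines win the edge capacity and the tolerant services land in the cloud; a production orchestrator would additionally handle dynamic re-placement as load and QoS measurements change, which is the gap the paper targets.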

Bringing the Cloud to the Edge -- A brief look at Edge/Fog Computing and its applicability to the Internet of Things and Big Data Analytics

2016

The “Cloud” is considered the powerhouse that will fuel and support the expansion of IoT. The Internet of Things (IoT) continues to gain momentum as vendors and enterprises begin to embrace the opportunities this market presents. According to new research from International Data Corporation (IDC), the worldwide Internet of Things market will grow from $655.8 billion in 2014 to $1.7 trillion in 2020, with a compound annual growth rate (CAGR) of 16.9% [11], with devices, connectivity, and IT services taking a majority stake in the IoT market. This emerging wave of end-computing deployment requires mobility support, geo-distribution, location awareness, and most notably very low latency. Will the Cloud be able to provide these features? Or perhaps the right question to ask is whether it will be able to sustain the expected growth of IoT, with billions of devices communicating over data shared across inter-clouds while providing the kind of quality of service that we have come to expect over time. In this paper, we pore over the existing Cloud Computing landscape and contemplate its place in the “Things to Come” era of computing. We look at new hierarchical distributed architectures that extend from the edge of the network to the core of the cloud and delve into the idea of extending the cloud out to the edge of the compute endpoints, something that is now being called “The Fog”.