Fog Computing Service Orchestration Mechanisms for 5G Networks
Related papers
Fog Computing as a Support for 5G Network
Journal of Emerging research and solutions in ICT, 2016
5G will enable future Internet of Services (IoS) paradigms such as Anything as a Service (AaaS), in which devices, terminals, machines, smart things, and robots become tools that both produce and consume applications, services, and data. However, 5G will have to support huge mobile traffic volumes and will also have to deal with the rapid increase of new and complex applications and services. On the other hand, Fog Computing, which extends Cloud Computing to the edge of the network, offers through its service orchestration mechanisms virtually unlimited dynamic resources for computation, storage, and service provision, overcoming the constraints of smart mobile devices. 5G in the fog computing environment will create opportunities for companies to deploy many new real-time services that cannot be delivered over current mobile and wireless networks. This paper evaluates Fog Computing as a support mechanism for the 5G Network in terms of latency, throughput, and energy efficiency.
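To make the latency and energy trade-off concrete, the following minimal Python sketch compares offloading a task to a nearby fog node against offloading it to a distant cloud data centre. It is not taken from the paper; the `offload_cost` function and every parameter value (payload size, uplink rate, round-trip times, radio power) are illustrative assumptions only.

```python
# Back-of-the-envelope sketch (illustrative assumptions, not measurements):
# latency and transmission energy of offloading a task to a fog node vs. a cloud.

def offload_cost(payload_mb, uplink_mbps, rtt_ms, tx_power_w):
    """Return (latency_ms, energy_j) for sending payload_mb over the given link."""
    tx_time_s = (payload_mb * 8) / uplink_mbps   # serialization time on the uplink
    latency_ms = tx_time_s * 1000 + rtt_ms       # transfer time plus round-trip delay
    energy_j = tx_power_w * tx_time_s            # radio energy spent while transmitting
    return latency_ms, energy_j

if __name__ == "__main__":
    # Assumed 5G uplink to a fog node one hop away vs. the same radio plus a
    # long multi-hop wired path to a remote cloud (only the RTT differs here).
    fog_latency, fog_energy = offload_cost(payload_mb=2, uplink_mbps=500, rtt_ms=2, tx_power_w=1.5)
    cloud_latency, cloud_energy = offload_cost(payload_mb=2, uplink_mbps=500, rtt_ms=60, tx_power_w=1.5)
    print(f"fog:   {fog_latency:.1f} ms, {fog_energy:.2f} J")
    print(f"cloud: {cloud_latency:.1f} ms, {cloud_energy:.2f} J")
```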
TelcoFog: A unified flexible fog and cloud computing architecture for 5G networks
Zenodo (CERN European Organization for Nuclear Research), 2017
We propose the TelcoFog architecture as a novel, secure, highly distributed and ultra-dense fog computing infrastructure, which can be allocated at the extreme edge of a wired/wireless network and allows a Telecom Operator to provide multiple unified, cost-effective and new 5G services, such as Network Function Virtualization (NFV), Mobile Edge Computing (MEC), and services for third parties (e.g., smart cities, vertical industries or the Internet of Things (IoT)). The distributed and programmable fog technologies proposed in TelcoFog are expected to strengthen the position of network operators in the mobile network and cloud markets. TelcoFog, by design, integrates an ecosystem for network operators willing to provide NFV, MEC and IoT services. Its key benefit is the dynamic deployment of new distributed low-latency services. The novel TelcoFog architecture consists of three main building blocks: a) a scalable TelcoFog node that is seamlessly integrated into the Telecom infrastructure; b) a TelcoFog controller, focused on service assurance and based on service data modeling using YANG, that is integrated into the management and orchestration architecture of the Telecom operator; and c) TelcoFog services, which run on top of the TelcoFog and Telecom infrastructure. The TelcoFog architecture is validated through a Proof of Concept for IoT services.
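As a rough illustration of the kind of inventory a TelcoFog-style controller might keep about nodes and services, here is a small Python sketch. The actual TelcoFog controller models services in YANG; the class names, fields, and the in-memory inventory below are hypothetical and chosen only to make the node/controller/service split tangible.

```python
# Hypothetical sketch (not the paper's YANG model) of controller-side state:
# fog nodes registered at the edge and descriptors of services to assure.
from dataclasses import dataclass, field

@dataclass
class TelcoFogNode:
    node_id: str                       # identifier of the node at the network edge
    site: str                          # access site, e.g. a cell site or central office
    cpu_total: float                   # vCPUs offered to the orchestrator
    mem_total_mb: int
    services: list[str] = field(default_factory=list)   # names of deployed services

@dataclass
class ServiceDescriptor:
    name: str                          # e.g. an NFV, MEC or IoT service
    image: str                         # container/VM image reference
    cpu: float
    mem_mb: int
    max_latency_ms: float              # service-assurance target the controller monitors

# Example: a toy in-memory inventory with one node and one service descriptor.
inventory = {
    "nodes": [TelcoFogNode("fog-01", "cell-site-A", 8.0, 16384)],
    "services": [ServiceDescriptor("iot-gateway", "registry/iot-gw:1.0", 1.0, 512, 10.0)],
}
print(inventory["nodes"][0].node_id, "->", inventory["services"][0].name)
```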
Fog computing in 5G networks: an application perspective
Cloud and Fog Computing in 5G Mobile Networks: Emerging advances and applications
The fifth generation (5G) cellular network promises its users sub-millisecond latency and transmission speeds of 1 gigabit per second. However, the current cloud-based computation and data delivery model does not allow these quality of service (QoS) guarantees to be efficiently harnessed, because the number of wired-network hops between the 5G base stations and the cloud leads to a significant increase in latency. Forwarding all the data generated by devices directly to the cloud may also devour the available bandwidth and lead to congestion. It is therefore necessary to host processing near the devices, close to the source of the data, so that the high-speed transmission of 5G can be utilized and data can be processed and filtered by the time it reaches the cloud. Bringing computation, storage and networking services down to the network edge opens up many new research areas in applying Fog computing over the cellular network architecture. This chapter discusses the advantages of extending cloud services to the edge by presenting use cases which can be realized by fog computing over 5G networks.
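The hop-count argument can be sketched with a toy delay model. The per-hop and radio delays in the Python snippet below are assumptions for illustration, not values from the chapter; the point is only that a fog node co-located with the base station removes the wired hops that dominate end-to-end delay towards a distant cloud.

```python
# Illustrative toy model of one-way delay vs. number of wired hops between a
# 5G base station and the compute site. All constants are assumed values.

RADIO_DELAY_MS = 1.0      # assumed 5G air-interface delay
PER_HOP_DELAY_MS = 5.0    # assumed propagation + queuing delay per wired hop

def one_way_delay_ms(wired_hops: int) -> float:
    return RADIO_DELAY_MS + wired_hops * PER_HOP_DELAY_MS

# A fog node at the base station needs no wired hops; a distant cloud may be many hops away.
print("fog  :", one_way_delay_ms(0), "ms")
print("cloud:", one_way_delay_ms(10), "ms")
```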
Scalable Comput. Pract. Exp., 2019
Cloud computing has been established as the most popular and suitable computing infrastructure, providing on-demand, scalable and pay-as-you-go computing resources and services for state-of-the-art ICT applications that generate massive amounts of data. Though the Cloud is certainly the most fitting solution for most applications with respect to processing capability and storage, it may not be so for real-time applications. The main problem with the Cloud is latency, as Cloud data centres are typically very far from both the data sources and the data consumers. This latency is acceptable for application domains such as enterprise or web applications, but not for modern Internet of Things (IoT)-based pervasive and ubiquitous application domains such as autonomous vehicles, smart and pervasive healthcare, real-time traffic monitoring, unmanned aerial vehicles, smart buildings, smart cities, smart manufacturing, cognitive IoT, and so on. The prerequisite for these types of application is that the latency between data generation and consumption should be minimal. For that, the generated data need to be processed locally instead of being sent to the Cloud. This approach is known as Edge computing, where data processing is done in edge devices such as set-top boxes, access points, routers, switches and base stations, which are typically located at the edge of the network. These devices are increasingly being equipped with significant computing and storage capacity to cater to the need for local Big Data processing. The enabling of Edge computing can be attributed to emerging network technologies, such as 4G and cognitive radios, high-speed wireless networks, and energy-efficient sophisticated sensors. Different Edge computing architectures have been proposed (e.g., Fog computing, mobile edge computing (MEC), cloudlets, etc.), all of which enable IoT and sensor data to be processed closer to the data sources. Among them, Fog computing, a Cisco initiative, has attracted the most attention from both academia and industry and has emerged as a new computing-infrastructural paradigm in recent years. Though Fog computing has been proposed as a different computing architecture from the Cloud, it is not meant to replace it. Rather, Fog computing extends Cloud services to the network edge, providing computation, networking, and storage services between end devices and data centres. Ideally, Fog nodes (edge devices) pre-process the data, serve the preliminary needs of the associated applications, and forward the data to the Cloud if they need to be stored and analysed further. Fog computing enhances the benefits of smart devices operating not only at the network perimeter but also alongside cloud servers. Fog-enabled services can be deployed anywhere in the network, and with the provisioning and management of these services there is huge potential to enhance intelligence within computing networks, realizing context awareness, low response times, and network traffic offloading. Several application areas of Fog computing are already established, for example sustainable smart cities, smart grids, smart logistics, environment monitoring, and video surveillance.
To design and implement Fog computing systems, various challenges need to be addressed, concerning system design and implementation, computing and communication, system architecture and integration, application-based implementations, fault tolerance, the design of efficient algorithms and protocols, availability and reliability, security and privacy, and energy efficiency and sustainability. Also, to make Fog compatible with the Cloud, several factors such as Fog and Cloud system integration, service collaboration between Fog and Cloud, and workload balancing between Fog and Cloud need to be taken care of.
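The pre-process-and-forward behaviour described above can be sketched in a few lines of Python. This is a minimal illustration under assumed data formats: the alert threshold, the record layout, and the `forward_to_cloud` stub are all hypothetical, standing in for whatever local analytics and cloud upload path a real deployment would use.

```python
# Minimal sketch of a fog node that filters sensor readings, reacts to
# latency-critical events locally, and forwards only a compact summary
# to the cloud. Thresholds and data layout are assumptions for illustration.
from statistics import mean

ALERT_THRESHOLD = 42.0            # assumed threshold for a local, low-latency reaction

def forward_to_cloud(summary: dict) -> None:
    print("uploading to cloud:", summary)   # stand-in for a real upload call

def handle_batch(readings: list[float]) -> None:
    # 1. Serve the time-critical need locally (no round trip to the cloud).
    alerts = [r for r in readings if r > ALERT_THRESHOLD]
    if alerts:
        print("local alert raised on", len(alerts), "readings")
    # 2. Pre-process: reduce the raw batch to a compact summary.
    summary = {"count": len(readings), "mean": mean(readings), "max": max(readings)}
    # 3. Forward only the reduced data to the cloud for storage and deeper analytics.
    forward_to_cloud(summary)

handle_batch([40.1, 43.7, 39.9, 41.0, 44.2])
```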
Class of service in fog computing
IEEE Latin-American Conference on Communications, 2017
Although Fog computing specifies a scalable architecture for computation, communication and storage, there is still a demand for better Quality of Service (QoS), especially for agile mobile services. Both industry and academia have been working on novel and efficient mechanisms for QoS provisioning in Fog computing. This paper presents a classification of services according to their QoS requirements, as well as Classes of Service for fog applications. This facilitates the decision-making process of the fog scheduler, specifically in identifying the timescale and location of resources, and helps make the deployment of new applications scalable. Moreover, this paper introduces a mapping between the proposed classes of service and the processing layers of the Fog computing reference architecture. The paper also discusses use cases in which the proposed classification of services would be helpful.
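A small Python sketch can show what such a classification and layer mapping might look like in practice. The concrete class names, thresholds, and layer assignments below are assumptions made for illustration, not the taxonomy defined in the paper.

```python
# Illustrative sketch: map QoS requirements to a class of service, then to a
# processing layer of a fog reference architecture. Classes and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class QoSRequirements:
    max_latency_ms: float
    min_bandwidth_mbps: float
    mission_critical: bool

def class_of_service(q: QoSRequirements) -> str:
    if q.mission_critical and q.max_latency_ms <= 10:
        return "CoS-1 (real-time, mission critical)"
    if q.max_latency_ms <= 100:
        return "CoS-2 (interactive)"
    return "CoS-3 (best effort / batch)"

# Assumed mapping from classes of service to processing layers.
LAYER_FOR_CLASS = {
    "CoS-1 (real-time, mission critical)": "edge fog nodes (closest to devices)",
    "CoS-2 (interactive)": "intermediate fog aggregation layer",
    "CoS-3 (best effort / batch)": "cloud data centre",
}

req = QoSRequirements(max_latency_ms=5, min_bandwidth_mbps=2, mission_critical=True)
cos = class_of_service(req)
print(cos, "->", LAYER_FOR_CLASS[cos])
```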
The confluence of Cloud computing, 5G, and IoT in the Fog
2019
In the wake of the arrival of cloud computing, future applications are poised to become more resilient and adaptive by embracing elasticity in an osmotic manner. Although cloud computing is a strong attractor for application developers, there are still unconquered performance frontiers. Latency-sensitive and mission-critical applications make up a significant portion of all software systems, and their owners are eager to reap the benefits of cloud computing. However, they are hindered by significant delay, jitter in the delay, and relatively low resilience when operating on traditional, distant, cloud data centres. Fog computing is emerging as a remedy. Fog computing is a heterogeneous hyper-distributed cloud infrastructure paradigm, ranging from small compute nodes close to the end-users to traditional distant data centres. With greater proximity to the end-users, delay and jitter in the delay can be reduced, and intermediate network reliability improved. Additionally, with i...
The Rise of Fog Computing in the Digital Era, 2019
Pushing computing, control, data storage, and processing into the cloud has been a key trend in the past decade. However, the cloud alone encounters growing limitations in meeting the demands of upcoming computing and intelligent networking, such as low latency, high mobility, high scalability, and real-time execution. A new paradigm called fog computing has emerged to overcome these limits. Fog extends cloud computing and services to the edge of the network. It provides data, computing, storage, and application services to end-users that can be hosted at the network edge. It reduces service latency and improves QoS/QoE, which results in a superior user experience. This chapter provides an introduction to and overview of fog computing, a comparison of fog computing with cloud computing and with mobile edge computing, a possible fog computing architecture, applications of fog computing, and possible research directions.
Fog orchestration for the Internet of Everything: state-of-the-art and research challenges
Journal of Internet Services and Applications
Recent developments in telecommunications have given rise to new paradigms, including the Internet of Everything, which provides services through the interconnection of different physical devices that exchange data to enrich and automate people's daily activities, and Fog computing, an extension of the well-known Cloud computing that brings tasks to the edge of the network, exploiting characteristics such as lower latency, mobility support, and location awareness. Combining these paradigms opens a new set of possibilities for innovative services and applications; however, it also brings a new and complex scenario that must be efficiently managed to properly fulfill the needs of the users. In this scenario, the Fog Orchestrator component is the key to coordinating services between Cloud computing and the Internet of Everything. In this paper, key challenges in the development of a Fog Orchestrator to support the Internet of Everything are identified, including how they affect the tasks that a Fog service Orchestrator should perform. Furthermore, different service Orchestrator architectures for the Fog are explored and analyzed in order to identify how the previously listed challenges are being tackled. Finally, a discussion about the open challenges, technological directions, and future of research on this subject is presented.
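To give a concrete feel for the coordination role a Fog Orchestrator plays, the Python sketch below matches pending service requests to fog or cloud resources with a simple greedy policy. The names, fields, and the policy itself are assumptions chosen for illustration; they do not describe any of the orchestrator architectures surveyed in the paper.

```python
# Hypothetical sketch of an orchestrator placement step: pick, for each request,
# the lowest-latency resource that still satisfies its capacity and latency bound.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    tier: str              # "fog" or "cloud"
    cpu_free: float
    latency_ms: float      # latency towards the requesting devices

@dataclass
class Request:
    service: str
    cpu: float
    max_latency_ms: float

def orchestrate(requests: list[Request], resources: list[Resource]) -> dict[str, str]:
    """Greedy placement: lowest-latency resource that still fits the request."""
    plan: dict[str, str] = {}
    for req in requests:
        candidates = [r for r in resources
                      if r.cpu_free >= req.cpu and r.latency_ms <= req.max_latency_ms]
        if not candidates:
            plan[req.service] = "rejected"
            continue
        chosen = min(candidates, key=lambda r: r.latency_ms)
        chosen.cpu_free -= req.cpu      # reserve capacity so later requests see less headroom
        plan[req.service] = chosen.name
    return plan

print(orchestrate([Request("video-analytics", 2.0, 20.0)],
                  [Resource("fog-a", "fog", 4.0, 8.0), Resource("dc-1", "cloud", 64.0, 70.0)]))
```

In a real orchestrator this decision step would sit inside a monitor-decide-act loop and take into account the reliability and mobility concerns the paper discusses; the greedy choice here is only the simplest possible placeholder.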
Orchestration in Fog Computing: A Comprehensive Survey
ACM Computing Surveys, 2023
Fog computing is a paradigm that brings computational resources and services to the network edge, in the vicinity of user devices, lowering latency while still connecting with cloud computing resources. Unlike cloud computing, fog resources are based on constrained and heterogeneous nodes whose connectivity can be unstable. In this complex scenario, there is a need to define and implement orchestration processes to ensure that applications and services can be provided in accordance with the established agreements. Although some publications have dealt with orchestration in fog computing, definitions still diverge and there is functional overlap with other areas, such as resource management and monitoring. This article presents a systematic review of the literature with a focus on orchestration in fog computing. A generic architecture of fog orchestration is presented, created from the consolidation of the analyzed proposals, bringing to light the essential functionalities addressed in the lit...
Fog Orchestration for Internet of Things Services
Large-scale Internet of Things (IoT) services such as healthcare, smart cities, and marine monitoring are pervasive in cyber-physical environments. These complex IoT services are increasingly composed of sensors, devices, and compute resources within fog computing infrastructures. Orchestrating such applications can simplify maintenance and enhance data security and system reliability. However, efficiently dealing with these services' dynamic variations and transient operational behavior is a crucial challenge. This article provides an overview of the core issues, challenges, and future research directions in fog-enabled orchestration for IoT services.