IoT enabled Smart Fog Computing for Vehicular Traffic Control
Related papers
Internet of Things, 2019
INTRODUCTION: The Internet was initially designed to connect web sites and portals, with data packets flowing over networks for communication at the corporate level. Over time, live video streaming, real-time data, and voice came to be offered over hosted Clouds for business and entertainment. Enterprise applications such as Office 365, banking, and e-commerce are available on smartphones. With the advent of Fog Computing and the Internet of Things, corporate enterprises and non-IT industries see potential in this technology. Billions of Internet-enabled devices, globally distributed nodes, and embedded sensor gateways transmit data generated in real time over the internet to cloud data centres. Cloud environments are not designed to handle data at this scale, and their computing limits are being severely tested. Fog Computing has the potential to become the go-to option for Cloud service delivery.

OBJECTIVES: This paper reviews existing research works and presents a unique Smart Fog Computing based taxonomy. The authors also implemented an experimental setup for Smart Cities using Smart Fog Computing to control vehicular traffic.

METHODS: Smart vehicular management is a viable use case for Fog and IoT technology. The authors designed and implemented two experimental setups. The first is a standard Cloud implementation; the second employs Fog Computing implemented with IoT sensor nodes. The two are compared on the response time and bandwidth consumed by the Vehicle Management Fog application. The architecture and implementation involved deploying 50 IoT sensor nodes across the university areas and routes.

RESULTS: The main results are as follows. Compared with Cloud computing, deploying Fog Computing and IoT devices reduced the end-to-end processing time from 29.44 to 6.7 seconds (about 77% less), the number of hops traversed from 56 to 4 (about 92% less), and the bandwidth usage from 247 to 8 kbps (about 96.7% less).

CONCLUSION: The experimental setups show that, compared with Cloud computing, the Fog and IoT setup processes the traffic data locally on the edge devices, which reduces the end-to-end time.
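A minimal sketch, not the authors' actual implementation, of the edge-first pattern the experiment compares: raw sensor readings are handled at a local fog node and only a compact summary travels onward to the cloud, which is where the reported time and bandwidth savings come from. The data structure, threshold, and function names below are illustrative assumptions.

```python
# Illustrative sketch: aggregate vehicular sensor readings at a fog node and
# forward only a small summary toward the cloud. Names and thresholds are
# hypothetical; the paper's deployment used 50 IoT sensor nodes on campus routes.

from dataclasses import dataclass
from statistics import mean

@dataclass
class VehicleReading:
    sensor_id: str
    vehicles_per_minute: float

def process_at_fog_node(readings: list[VehicleReading],
                        congestion_threshold: float = 30.0) -> dict:
    """Aggregate raw readings locally and flag congested segments."""
    avg_flow = mean(r.vehicles_per_minute for r in readings)
    congested = [r.sensor_id for r in readings
                 if r.vehicles_per_minute > congestion_threshold]
    # Only this summary (not the raw stream) would be sent to the cloud.
    return {"avg_flow": avg_flow, "congested_sensors": congested}

if __name__ == "__main__":
    sample = [VehicleReading("S1", 12.0), VehicleReading("S2", 41.5),
              VehicleReading("S3", 28.3)]
    print(process_at_fog_node(sample))
```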
REVIEW OF IMPLEMENTING FOG COMPUTING
Fog Computing is a paradigm that extends Cloud computing and services to the edge of the network. Similar to the Cloud, Fog provides data, compute, storage, and application services to end-users. It is a model in which data, processing and applications are concentrated in devices at the network edge rather than existing almost entirely in the cloud. This document describes the various features of Fog Computing and presents a case study, along with an actual implementation of fog computing in traffic analysis, to show how fog computing is applied in the edge environment. This document also discusses the differences between fog computing and cloud computing.
Fog Computing: An Empirical Study
2020
In the field of the Internet of Things (IoT), devices can by themselves recognize the environment and carry out certain functions on their own. IoT devices mainly consist of sensors. Cloud computing based on sensor networks manages huge amounts of data, including data transfer and processing, which delays the service response time. As sensor networks grow, the demand to control and process data on IoT devices is also increasing. Vikas Vashisth | Harshit Gupta | Dr. Deepak Chahal, "Fog Computing: An Empirical Study", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume 4, Issue 3, April 2020. URL: https://www.ijtsrd.com/papers/ijtsrd30675.pdf Paper URL: https://www.ijtsrd.com/computer-science/realtime-computing/30675/fog-computing-an-empirical-study/vikas-vashisth
Scalable Comput. Pract. Exp., 2019
Cloud computing has been established as the most popular and suitable computing infrastructure, providing on-demand, scalable and pay-as-you-go computing resources and services for state-of-the-art ICT applications that generate a massive amount of data. Though the Cloud is certainly the most fitting solution for most applications with respect to processing capability and storage, it may not be so for real-time applications. The main problem with the Cloud is latency, as Cloud data centres are typically very far from the data sources as well as the data consumers. This latency is acceptable for application domains such as enterprise or web applications, but not for modern Internet of Things (IoT)-based pervasive and ubiquitous application domains such as autonomous vehicles, smart and pervasive healthcare, real-time traffic monitoring, unmanned aerial vehicles, smart buildings, smart cities, smart manufacturing, cognitive IoT, and so on. The prerequisite for these types of applications is that the latency between data generation and consumption should be minimal. For that, the generated data need to be processed locally instead of being sent to the Cloud.

This approach is known as Edge computing, where data processing is done at the network edge in edge devices such as set-top boxes, access points, routers, switches and base stations, which are typically located at the edge of the network. These devices are increasingly being equipped with significant computing and storage capacity to cater to the need for local Big Data processing. The enabling of Edge computing can be attributed to emerging network technologies, such as 4G and cognitive radios, high-speed wireless networks, and energy-efficient sophisticated sensors. Different Edge computing architectures have been proposed (e.g., Fog computing, mobile edge computing (MEC), cloudlets, etc.). All of these enable IoT and sensor data to be processed closer to the data sources. Among them, Fog computing, a Cisco initiative, has attracted the most attention from both academia and industry and has emerged as a new computing-infrastructural paradigm in recent years.

Though Fog computing has been proposed as a different computing architecture than the Cloud, it is not meant to replace the Cloud. Rather, Fog computing extends Cloud services to network edges, providing computation, networking, and storage services between end devices and data centres. Ideally, Fog nodes (edge devices) pre-process the data, serve the needs of the associated applications preliminarily, and forward the data to the Cloud if the data need to be stored and analysed further. Fog computing enhances the benefits of smart devices operating not only at the network perimeter but also under cloud servers. Fog-enabled services can be deployed anywhere in the network, and with the provisioning and management of these services, there is huge potential to enhance intelligence within computing networks to realize context-awareness, fast response times, and network traffic offloading. Several applications of Fog computing are already established, for example sustainable smart cities, smart grid, smart logistics, environment monitoring, and video surveillance.
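As a rough illustration of the fog-node role just described (pre-process locally, serve latency-critical requests at the edge, defer long-term storage and analytics to the cloud), the sketch below shows one possible per-event policy. The event fields, the queue, and the decision rule are assumptions made for illustration, not an API from the paper.

```python
# Minimal sketch of a fog node that answers latency-sensitive requests locally
# and queues data that needs long-term storage or deeper analysis for the cloud.
# The event schema and policy are illustrative assumptions.

from collections import deque

class FogNode:
    def __init__(self):
        self.cloud_upload_queue = deque()

    def handle(self, event: dict) -> dict:
        # Local pre-processing: e.g. rounding/smoothing a raw sensor value.
        event["value"] = round(event.get("value", 0.0), 2)

        if event.get("needs_history", False):
            # Historical analytics belong in the cloud; defer the upload.
            self.cloud_upload_queue.append(event)
            return {"served_at": "cloud (deferred)", "event": event}

        # Latency-sensitive path: answer at the network edge.
        return {"served_at": "edge", "event": event}

node = FogNode()
print(node.handle({"sensor": "cam-3", "value": 17.348}))
print(node.handle({"sensor": "cam-3", "value": 18.2, "needs_history": True}))
print(len(node.cloud_upload_queue), "event(s) queued for the cloud")
```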
To design and implement Fog computing systems, various challenges need to be addressed, concerning system design and implementation, computing and communication, system architecture and integration, application-based implementations, fault tolerance, the design of efficient algorithms and protocols, availability and reliability, security and privacy, and energy-efficiency and sustainability. Also, to make Fog compatible with the Cloud, several factors need to be taken care of, such as Fog and Cloud system integration, service collaboration between Fog and Cloud, and workload balance between Fog and Cloud.
Fog Computing-Based Model for Mitigation of Traffic Congestion
Fog computing enables processing at the edge while still offering the possibility to interact with the cloud. The massive amounts of data produced by connected applications, as well as latency-critical requirements, have motivated researchers to adopt fog computing technology. Developing an effective and reliable traffic system has always been a worldwide concern, especially in cities characterized by heavy traffic and a high traffic congestion index. Fog computing can help in mitigating road traffic congestion. This paper introduces fog computing and its relevance to both cloud computing and the Internet of Things (IoT), along with a model for mitigation of traffic congestion. Fog nodes are proposed to be assigned the tasks of acquiring, analyzing, and processing local traffic data at traffic junctions. The obtained results show that more reliable traffic is achieved, where congestion can be reduced through homogeneous traffic distribution based on the average rate of traffic flow among the fog nodes.
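A hedged sketch of the balancing idea mentioned above: each junction's flow rate is compared with the network-wide average so that a fog node can see which junctions carry a surplus and which have spare capacity. The data structure and the simple difference-from-average rule are assumptions for illustration, not the paper's model.

```python
# Rough sketch: express each junction's load relative to the average flow rate,
# which a fog node could use to decide signal timing or rerouting hints.
# Figures and the rule itself are invented for illustration.

from statistics import mean

def rebalance(flow_by_junction: dict[str, float]) -> dict[str, float]:
    """Return the surplus (+) or spare capacity (-) of each junction
    relative to the average flow rate across all junctions."""
    avg = mean(flow_by_junction.values())
    return {junction: flow - avg for junction, flow in flow_by_junction.items()}

flows = {"J1": 52.0, "J2": 18.0, "J3": 35.0}   # vehicles per minute (made up)
print(rebalance(flows))   # J1 is ~17 veh/min above the average, J2 ~17 below
```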
Performance Analysis of Internet of Things Protocols Based Fog/Cloud over High Traffic
The Internet of Things (IoT) is becoming the future of a global data field in which embedded devices communicate with each other, exchange data, and make decisions through the Internet. IoT could improve the quality of life in smart cities, but a massive amount of data from different smart devices could slow down or crash database systems. In addition, transferring IoT data to the Cloud for monitoring information and generating feedback leads to high delay at the infrastructure level. Fog Computing can help by offering services closer to edge devices. In this paper, we propose an efficient system architecture to mitigate the problem of delay. We provide a performance analysis of response time, throughput and packet loss for the MQTT (Message Queue Telemetry Transport) and HTTP (Hyper Text Transfer Protocol) protocols based on Cloud or Fog servers, with a large volume of data from an emulated traffic generator working alongside one real sensor. We implement both protocols in the same architecture, with low-cost embedded devices for the local and Cloud servers on different platforms. The results show that the HTTP response time is 12.1 and 4.76 times higher than MQTT for Fog and Cloud servers located in the same geographical area as the sensors, respectively. The worst performance is observed when the Cloud is public and outside the country region. The throughput results show that MQTT has the capability to carry the data with the available bandwidth and the lowest percentage of packet loss. We also show that the proposed Fog architecture is an efficient way to reduce latency and enhance performance in Cloud-based IoT.
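To make the MQTT-versus-HTTP comparison concrete, here is a toy single-shot timing sketch in the spirit of the study: publish one payload over MQTT and POST the same payload over HTTP, measuring wall-clock time for each. The broker and server addresses, topic, and endpoint are placeholders, and this is far simpler than the paper's emulated traffic-generator testbed.

```python
# Toy timing comparison: one MQTT publish vs one HTTP POST of the same payload.
# Addresses, topic, and endpoint are assumed placeholders, not the paper's setup.

import json
import time

import requests                      # pip install requests
import paho.mqtt.publish as publish  # pip install paho-mqtt

PAYLOAD = {"sensor": "t-01", "temp_c": 24.7}
MQTT_BROKER = "fog-gateway.local"                          # assumed fog broker
HTTP_ENDPOINT = "http://fog-gateway.local:8080/readings"   # assumed REST endpoint

def time_mqtt() -> float:
    start = time.perf_counter()
    publish.single("sensors/t-01", json.dumps(PAYLOAD), hostname=MQTT_BROKER)
    return time.perf_counter() - start

def time_http() -> float:
    start = time.perf_counter()
    requests.post(HTTP_ENDPOINT, json=PAYLOAD, timeout=5)
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"MQTT publish took {time_mqtt()*1000:.1f} ms")
    print(f"HTTP POST took    {time_http()*1000:.1f} ms")
```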
Design, Resource Management, and Evaluation of Fog Computing Systems: A Survey
IEEE Internet of Things Journal, 2021
A steady increase in Internet of Things (IoT) applications needing large-scale computation and long-term storage has led to an over-reliance on Cloud computing. The resulting network congestion in the Cloud, coupled with the distance of Cloud data centres from IoT, contributes to unreliable end-to-end response delay. Fog computing has been introduced as an alternative to the cloud, providing low-latency service by bringing processing and storage resources to the network edge. In this survey, we sequentially present the phases required in the implementation and realization of practical fog computing systems: (1) design & dimensioning of a fog infrastructure, (2) fog resource provisioning for IoT application use and IoT resource allocation to fog, (3) installation of fog frameworks for fog resource management, and (4) evaluation of fog infrastructure through simulation & emulation. Our focus is on determining the implementation aspects required to build a practical large-scale fog computing infrastructure to support the general IoT landscape.
360 Deg. Overview of Fog Computing
International Journal for Research in Applied Science and Engineering Technology, 2019
Fog Computing (introduced in 2012) is now considered the most prioritized choice for Internet of Things applications. Fog Computing allows part of the processing done in the cloud to be moved to the network devices along the path from the node to the cloud. It brings communication and storage services into proximity with the end user. It is well suited to support time-dependent, location-dependent, massive-scale, and latency-sensitive applications. Fog computing lightens the load on the cloud by filtering out needless data before forwarding the rest to the cloud. This paper is organized into two parts. The first part covers the basic outline and architecture of fog. The second part discusses the working of Fog Computing and its benefits and limitations.
Design and Evaluation of the Fog Computing Architecture
2018
Fog computing is an emerging computing paradigm for enabling cloud services at network edge devices such as routers, gateways, and set-top boxes. Using fog computing, constrained devices can utilize cloud resources within a one-hop network, so that resource-intensive tasks can be offloaded to resource-rich nodes. Autonomous vehicles are revolutionising the centuries-old transportation system. Self-driving cars are the enablers of urban Intelligent Transport Systems (ITS) and are paving the way for smarter cities. Autonomous vehicles must operate on huge dynamic datasets with limited resources and time. Thus, applying fog computing to vehicular networks appears to be a promising technological advancement. To that end, a fog computing network is designed and evaluated in a simulated environment as the primary motive of this thesis. The simulation setup is based on a distributed modular server architecture. In this design, each server machine has a one-to-one mapping with a client. Th...
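As a small illustration of the offloading trade-off described above (resource-intensive tasks pushed from a constrained vehicle device to a one-hop fog node), the sketch below estimates whether offloading a task beats running it locally. The cost model, cycle counts, and link rate are invented assumptions, not parameters from the thesis.

```python
# Hedged sketch of an offloading decision an on-board unit might make:
# run small tasks locally, push heavy ones to a one-hop fog node.
# All constants below are assumed values for illustration only.

def should_offload(task_cycles: float,
                   input_bytes: int,
                   local_cps: float = 1e8,   # assumed on-board cycles/second
                   fog_cps: float = 1e9,     # assumed fog-node cycles/second
                   link_bps: float = 5e6) -> bool:
    """True if estimated fog time (transfer + compute) beats local compute time."""
    local_time = task_cycles / local_cps
    fog_time = (input_bytes * 8) / link_bps + task_cycles / fog_cps
    return fog_time < local_time

# A heavy perception task with moderate input size: offloading likely wins.
print(should_offload(task_cycles=5e8, input_bytes=200_000))   # True
# A light telemetry aggregation: transfer cost outweighs the gain.
print(should_offload(task_cycles=1e6, input_bytes=50_000))    # False
```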
Issues and Research Paths in Fog Computing
International Journal for Research in Applied Science and Engineering Technology (IJRASET), 2022
Cloud computing is a framework that provides data storage and data processing to edge network users. Until recently, the cloud has been a great solution for accessing and processing our data any time and everywhere. But the falling price of internet-connected devices is increasing the number of end users in the edge network. Consequently, the data coming from the edge network become concentrated around the cloud, which causes congestion and significant response latency. Fog computing has been introduced as a solution to this congestion problem; it is an extension of the cloud placed closer to each area of end users. This solution provides low response latency for devices that request data from the cloud. It also provides processing and storage features to IoT devices and sensors which do not have them. In this work, we present Fog computing by defining the limits of cloud computing which led to the creation of the Fog. Afterwards, we set out the Fog computing architecture, underscoring its differences from the cloud. We also identify some open issues and research paths for this new technology. Finally, we present some related works.