Dr. Krishan Kumar Sharma | Lingaya's University
Papers by Dr. Krishan Kumar Sharma
Wireless Sensor Networks have grown with the use of advanced methods to develop such networks in various fields like biometrics, weather monitoring, military applications, etc. Wireless networks are deployed in various regions, and one of the key issues is that the sensor nodes depend largely on the batteries fitted in them at the time of installation. In most cases, the sensor nodes have to rely on fixed, non-rechargeable batteries. Thus it is of utmost importance to save as much energy as possible and thereby preserve the network's energy and hence its lifetime. Several routing protocols have been designed, all primarily focused on maximizing network lifetime by ensuring the efficient use of sensor node energy. Among these, clustering-based protocols are the most widely used and are extensively researched to devise new routing protocols that preserve network energy. The main focus of this research work is to study the various available clustering-based routing protocols and observe their rules and features. An attempt is also made to devise a new protocol based on a Fuzzy-Based Stable Election Protocol for heterogeneous networks. A simulated virtual environment is built in MATLAB to understand the network setup and implement the algorithm for energy-efficient routing. A number of protocols have been studied in depth in this research work. The proposed protocol has shown improvement in the number of dead nodes and the average energy per node compared to the existing SEP protocol when a comparative study was done with similar simulation parameters for both methods.
INTRODUCTION A typical wireless sensor network (WSN) consists of a large number of distributed embedded devices that can be used to collect data of various kinds, like temperature, pressure, humidity, pest count, etc., depending on what kind of sensor they are embedded with. The development of electronics and the availability of low-cost on-chip devices have led to various kinds of specialized sensor network paradigms for monitoring environmental or physical conditions [1-2]. In a WSN, however, due to the kinds of areas in which the sensors are employed and the tasks they perform, they are always constrained by limitations in energy supply, computational capability, and bandwidth. The energy limitation is the most important of these because it is directly related to the lifetime of the whole network. Network lifetime is simply a measure of how long the sensors sustain: the longer the batteries last, the better the network lifetime. Thus conserving power is always an important factor when designing any wireless sensor network protocol. In this paper, a novel approach using fuzzy logic has been implemented for a heterogeneous network. The upcoming sections discuss various clustering methods proposed by earlier researchers, followed by a brief description of the proposed method, and finally an analysis of the various results obtained after the simulation.
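The paper's actual fuzzy rule base is not reproduced in this abstract, but the general idea of combining SEP-style weighted election probabilities with a fuzzy degree for residual energy can be sketched as follows. This is a minimal illustration only: the function names, the single normalized "energy degree" input, and the simple scaling rule are assumptions, not the authors' protocol.

```python
def sep_probability(is_advanced, p_opt=0.1, alpha=1.0, m=0.1):
    """SEP-style election probability: advanced nodes (carrying alpha
    times extra initial energy, a fraction m of the network) are
    weighted higher so they serve as cluster heads more often."""
    if is_advanced:
        return p_opt * (1 + alpha) / (1 + alpha * m)
    return p_opt / (1 + alpha * m)

def fuzzy_election_chance(residual_energy, initial_energy, is_advanced):
    """Hypothetical fuzzy modulation: scale the SEP probability by the
    node's normalized residual energy, so depleted nodes are elected
    less often regardless of their class."""
    energy_degree = max(0.0, min(1.0, residual_energy / initial_energy))
    return sep_probability(is_advanced) * energy_degree
```

Under these assumptions a fresh advanced node gets a higher election chance than a fresh normal node, and a half-depleted node's chance drops in proportion to its remaining energy.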
In current scenarios, the demand for WSNs has rapidly increased in various applications like weather monitoring, petroleum, and the military due to the low power, small size, and light weight of wireless sensors. However, these inexpensive sensors are equipped with limited battery power and are thus energy-constrained. One of the major issues with WSNs is the need to increase the lifetime of the network. Generally, network lifetime is defined as the time at which the first node fails to send its information to the base station. This issue can be addressed by implementing data aggregation, which decreases data traffic and further saves energy by merging multiple incoming packets into a single packet whenever the sensed information is highly correlated. Much research has been carried out to further extend network lifetime. I. INTRODUCTION A wireless sensor network is typically composed of a large number of sensor nodes, which may be densely deployed either inside the phenomenon or very close to it, depending on the field of their usage. The positions of sensor nodes need not be engineered or predetermined. This allows random deployment in inaccessible terrains or disaster relief operations. On the other hand, it also means that sensor network protocols and algorithms need to possess self-organizing capabilities. Another unique feature of sensor networks is the cooperative effort of sensor nodes. Instead of sending raw data to the nodes responsible for fusion, sensor nodes use their processing abilities to locally carry out simple computations and transmit only the required, partially processed data. Realization of these and other sensor network applications requires wireless ad hoc networking techniques.
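The aggregation idea described above (merge packets whose sensed values are highly correlated) can be sketched in a few lines of Python. The grouping-by-closeness rule and the tolerance parameter are illustrative assumptions, not the scheme from any particular paper:

```python
def aggregate_readings(readings, tol=0.5):
    """Merge close-valued readings into one averaged packet.
    Sorting first means 'highly correlated' is approximated by
    'numerically close'; each returned value stands in for one
    packet sent to the base station instead of several."""
    packets = []
    group = []
    for value in sorted(readings):
        if group and value - group[0] > tol:
            packets.append(sum(group) / len(group))
            group = []
        group.append(value)
    if group:
        packets.append(sum(group) / len(group))
    return packets
```

For example, four incoming packets with readings [20.3, 20.1, 25.0, 20.2] collapse to two outgoing packets, roughly [20.2, 25.0], cutting the radio traffic in half for that round.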
Although a number of protocols and algorithms have been proposed for traditional wireless ad hoc networks, they are not well suited to the unique features and application requirements of wireless sensor networks. The differences between sensor networks and ad hoc networks are outlined below:
- The number of sensor nodes in a sensor network is higher than the number of nodes in an ad hoc network.
- Sensor nodes are densely deployed compared to ad hoc nodes.
- Sensor nodes are more prone to failures.
- The topology of a sensor network changes very frequently.
- Sensor nodes mainly use a broadcast or multi-hop communication paradigm, whereas most ad hoc networks are based on point-to-point communication.
- Sensor nodes have limited power, computational capacity, and memory.
- Sensor nodes may not have global identification (ID) because of the large overhead and the large number of sensors.
Designing suitable routing algorithms for different applications, fulfilling their different performance demands, has been considered an important issue in wireless sensor networks. In this context, many routing algorithms have been proposed to meet the performance demands of various applications through the network layer of the wireless sensor network protocol stack [3, 4], but most of them are based on single-path routing. In the single-path routing approach, the source selects a single path that satisfies the performance demands of the application for transmitting the load towards the sink. Though a single path between source and sink can be established with minimal computational complexity and resource utilization, factors such as the limited capacity of a single path reduce the available throughput [5]. Secondly, considering the unreliability of wireless links, single-path routing is not resilient to link failures, which degrades network performance.
Finding an alternate path after the primary path has been disrupted, in order to continue the data transmission, causes extra overhead and increases the delay in data delivery. Due to these factors, single-path routing cannot be considered an effective technique for meeting the performance demands of various applications.
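The reliability argument against single-path routing can be made concrete with a simple independence model (an illustrative calculation, not one from the cited works): if each wireless link succeeds with probability p, an h-hop path delivers with probability p^h, and k link-disjoint paths deliver with probability 1 - (1 - p^h)^k.

```python
def delivery_probability(p_link, hops, paths=1):
    """Probability that at least one of `paths` link-disjoint
    `hops`-hop routes delivers a packet, assuming each link
    succeeds independently with probability `p_link`."""
    single_path = p_link ** hops
    return 1.0 - (1.0 - single_path) ** paths
```

With p = 0.9 over 5 hops, a single path delivers only about 59% of packets, while two disjoint paths deliver about 83%, which is why the extra overhead of maintaining alternate routes can pay off.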
As network traffic increases daily, the consumption of resources increases at the same rate. Cloud computing data centers are becoming increasingly popular and important for provisioning computer networking resources. In the past, much importance was given to resources that could be employed using virtualization technology. Recently, however, the focus has been moving from LAN and WAN data centers to cloud computing as the core system infrastructure. The requirements for such implementations depend on compute resources, storage, and networking, which together are evolving into the Software Defined Network (SDN). Rather than being limited by physical infrastructure and applications, the precisely defined network in a virtual environment is the future. The Software Defined Network plays an important role in providing a common platform for all possible applications, innovations, operators, vendors, and hardware components. The main objective of this literature review is to describe the ongoing efforts to address these thought-provoking issues.
With the ever-growing internet and interconnection among everything, the need for security is a demand of the time. The Internet of Things (IoT), which connects every object or device with networking capabilities, is an area of great concern with respect to security. Objects include home automation sensors, medical equipment, vehicular sensors, nuclear reactors, and any life-critical real-time sensing devices. This means that a lack of security in the IoT can pose a risk to human lives. The IoT comprises many heterogeneous devices that use diverse protocols. Each protocol follows different access mechanisms and security measures, but a unified security mechanism is still not in place for the IoT. Conventional security approaches like Intrusion Detection and Prevention Systems (IDPS) and firewalls are deployed at internet edge devices to protect against external attacks, but in the IoT, which is a seamless and borderless network, access control becomes more difficult. In this paper, we discuss the current state of the IoT and its security challenges. We describe the need for SDN and its evolution, analyze an SDN-based IoT architecture, and finally propose a security framework based on the SDN-IoT architecture.
In this paper, network function virtualization (NFV) and its impact on 5G are reviewed. Network function virtualization virtualizes entire classes of network node functions into building blocks that may connect, or chain together, to create communication services. The goal of NFV is to shift network functions away from dedicated hardware devices, allowing network services that are now carried out by routers, firewalls, load balancers, and other dedicated hardware to be hosted on virtual machines (VMs). NFV is important because network administrators no longer need to purchase dedicated hardware devices in order to build a service chain. Because server capacity can be added through software, network administrators will not need to expand their data centers, which reduces both capital expenditure (CapEx) and operating expenditure (OpEx). If an application running on a VM requires more bandwidth, the administrator can move the VM to another physical server, or provision another virtual machine on the original server to take part of the load. This flexibility allows an IT department to respond faster to changing business goals and network service demands. The aim of network function virtualization is to transform the way network operators design networks by evolving standard IT virtualization technology to consolidate many network equipment types onto industry-standard high-volume servers, switches, and storage, which can be located in data centres; these virtual appliances can be instantiated on demand without the installation of new equipment.
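The elasticity decision described above ("move the VM or provision another one") can be sketched as a toy placement routine. The host model (a map of free bandwidth per host) and the most-headroom rule are assumptions for illustration only, not an NFV orchestrator's actual algorithm:

```python
def pick_migration_target(hosts, extra_demand):
    """Toy NFV scale decision: given hosts as {name: free_bandwidth}
    and a VM's extra bandwidth demand, return the host with the most
    headroom that can absorb the demand, or None if no single host
    can (in which case the operator would scale out with another VM)."""
    best = max(hosts, key=hosts.get)  # host with the most free bandwidth
    if hosts[best] < extra_demand:
        return None
    return best
```

For example, with hosts {"h1": 2.0, "h2": 8.0} (Gbit/s free), a VM needing 5.0 more would migrate to h2, while a demand of 10.0 would force the scale-out branch instead.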
In this paper, the various stages of modern radar signal processing are studied. In modern radar sets, the conversion of radar signals to digital form is typically accomplished after IF amplification and phase-sensitive detection. The signal processor is the part of the system that separates targets from clutter on the basis of Doppler content and amplitude characteristics. Radar signal processing and radar data processing together denote all the measures that compute an evaluable image for the radar display from the received echo signal. Radar signal processing refers specifically to the part that still deals with the magnitude of the echo signal, whether analogue or, as is often the case, already present as a digital value.
We present a cloud-enhanced, four-wheeled, mobile minirobot assembled from low-cost, off-the-shelf hardware parts and open-source software building blocks. These building blocks provide a reusable and extensible base for emerging applications mixing robotics with the Internet of Things.
This paper provides an overview of the Internet of Things (IoT) with emphasis on enabling technologies, protocols, and application issues. The IoT is enabled by the latest developments in RFID, smart sensors, communication technologies, and Internet protocols. The basic premise is to have smart sensors collaborate directly without human involvement to deliver a new class of applications. The current revolution in Internet, mobile, and machine-to-machine (M2M) technologies can be seen as the first phase of the IoT. In the coming years, the IoT is expected to bridge diverse technologies to enable new applications by connecting physical objects together in support of intelligent decision making. This paper starts by providing a horizontal overview of the IoT. Then, we give an overview of some technical details that pertain to the IoT enabling technologies, protocols, and applications. Compared to other survey papers in the field, our objective is to provide a more thorough summary of the most relevant protocols and application issues to enable researchers and application developers to get up to speed quickly on how the different protocols fit together to deliver desired functionalities without having to go through RFCs and the standards specifications. We also provide an overview of some of the key IoT challenges presented in the recent literature and provide a summary of related research work. Moreover, we explore the relation between the IoT and other emerging technologies including big data analytics and cloud and fog computing. We also present the need for better horizontal integration among IoT services. Finally, we present detailed service use-cases to illustrate how the different protocols presented in the paper fit together to deliver desired IoT services.
While searching for information about cloud computing on the internet, I came across the computing keyword Green Computing. After much research on this term, I tried to form a clear picture of it; this paper will help researchers and administrators gain a clear understanding of Green Computing. In this modern era of globalization, computers play a vital role in every field, so the need for computers increases day by day, and to fulfil this need a large amount of electricity is required to manufacture computer functional units such as CPUs, memory, monitors, and peripheral devices. As a result, a large amount of carbon is released into the atmosphere; these carbon emissions affect us directly or indirectly and are responsible for hazards on Earth. Moreover, the growing demand for cloud infrastructure has drastically increased the energy consumption of data centres, which has become a critical issue. High energy consumption not only translates to high operational cost, which reduces the profit margin of cloud providers, but also leads to high carbon emissions, which is not environmentally friendly. Many techniques have been proposed to overcome this problem; one of them is green computing. Green computing promotes the eco-friendly use of computers and their resources. In order to design such solutions, a deep analysis of the cloud with respect to power efficiency is required. IT industries are responsible for about 3% of the world's energy consumption, with an increase of 20% per year. The main goal of green computing is to increase the efficiency of IT products over their lifecycle and to endorse the recycling of discarded goods and factory waste. This paper introduces some energy-efficient green computing techniques and their use in the future.
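The figures quoted above (roughly 3% of world energy consumption, growing about 20% per year) compound quickly, which is the core of the green computing argument. The projection below is a plain compound-growth illustration based on those two numbers, not a forecast from the cited literature:

```python
def projected_share(initial_share=3.0, growth=0.20, years=5):
    """Compound growth: the energy-consumption share (in percent)
    after `years` of `growth` per year, starting from `initial_share`."""
    return initial_share * (1 + growth) ** years
```

At 20% annual growth, a 3% share would roughly double in four years (3 × 1.2⁴ ≈ 6.2%) and reach about 7.5% after five, if the trend held unchanged.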
In numerous areas of computing there has been a continuous tension between centralization and decentralization, which prompted the move from mainframes to PCs and local networks in the past, and the consolidation of services and applications in clouds and data centers more recently. The expansion of technological advances such as high-capacity mobile end-user devices, powerful dedicated connection boxes deployed in most homes, powerful wireless networks, and IoT (Internet of Things) devices, along with growing user concerns about privacy, trust, and independence, calls for handling information at the edge of the network. This requires moving the control of computing applications, data, and services away from the core towards the edge of the Internet. The relevance of cloud computing to mobile networks is on an upward spiral. Edge computing can potentially address concerns about response time requirements, bandwidth cost savings, elastic scalability, battery life constraints, QoS, etc. MEC additionally offers a high-bandwidth, ultra-low-latency environment that gives real-time access to radio network information at the edge of the mobile network. Currently, it is being used to enable on-demand elastic access to, or interaction with, a shared pool of reconfigurable computing resources such as servers, peer devices, storage, and applications at the edge of the wireless network, in close proximity to mobile users. It overcomes obstacles of traditional central clouds by offering wireless network information and local context awareness, as well as low latency and bandwidth conservation. In this paper, we introduce edge computing and the edge cloud, explain why we need edge computing, and discuss its classifications, various frameworks, applications, and several case studies. Finally, we present several challenges, concerns, and directions for future work in the field of edge computing.
Index Terms— Mobile Edge Computing (MEC), Internet of Things (IoT)
The purpose of this paper is to compare the traffic-handling efficiency of two STM-16 devices, the Huawei OptiX 3500 and the Tejas TJ 1400. The present study compares the efficiency of both products. Developments in transmission systems like MADM and DWDM are moving towards higher traffic-handling bandwidth. The evolution of mobile generations from 1G to 5G, together with the requirement for higher bandwidth for leased lines, internet broadband, and NGN, demands higher bandwidth/traffic-handling capacity from the transmission system. All the latest developments in the fields of PSTN, broadband, and mobile need higher bandwidth to serve the purposes of the technology.
A hot topic in today's context, FPGA implementation of the Internet of Things (IoT) is a recently introduced technique. The Internet of Things is connecting people and smart devices on a scale that was once unimaginable. A further challenge for the IoT is handling the vast amount of sensed data generated by smart devices that are resource-limited and subject to missing data due to link failures. In this paper we present a concept for implementing the IoT on an FPGA platform: a low-cost FPGA implementation of an entire IoT subset, including the TCP/IP protocol, control system, data acquisition, etc. IoT applications on FPGA platforms have received significant attention from the research community in the past few years. This technique offers a complete, low-cost, powerful, and user-friendly way of providing 24-hour real-time monitoring and remote sensing. The main aim of this research is to highlight how users can access FPGA-based design resources from anywhere. Thus we present a concept that enables momentarily unused resources to be applied to executing various tasks automatically.
Wireless Sensor Networks have been increasing with the use of advanced methods to develop these k... more Wireless Sensor Networks have been increasing with the use of advanced methods to develop these kinds of networks in various fields like biometrics, weather monitoring, military applications etc. Wireless Networks are deployed in various regions and one of the key issues is that the sensor nodes are largely dependent on the batteries which are fitted in them at the time of installation. In most of the cases, the sensor nodes have to be largely dependent on the fixed non-rechargeable batteries. Thus it is for utmost importance to save the energy as much as possible and preserve the network energy and hence its lifetime. There have been several routing protocols designed and all are primarily focused on providing maximum network lifetime by ensuring the efficient use of sensor node energy. Among these protocols, the clustering based protocols are the most widely used and are extensively researched to devise new routing protocols to preserve the network energy. In this research work, the main focus is to study the various available clustering based routing protocols and observe their rules and features. An attempt is also made to devise a new protocol based on Fuzzy Based Stable Election Protocol for heterogeneous networks. A simulated virtual environment is envisaged to be built in MATLAB to understand the network setup and implement the algorithm for an energy efficient routing. A number of protocols have been studied in depth in this research work. The proposed protocol have shown improvement in a number of dead nodes and average energy per node as compared to the existing SEP protocol when a comparative study was done on similar simulation parameters on both the methods. I. 
INTRODUCTION A typical wireless sensor network (WSN) consists of a large number of distributed embedded devices which can be used are to collect data of various kinds like temperature, pressure, humidity, pest count etc. depending on what kind of sensor they are embedded with. The development of electronics and the availability of less costly on-chip devices has led to the development of various kinds of exclusive sensor network paradigms, environmental or physical conditions [1-2]. In WSN, however, due to the kind of areas where they are employed and the task they perform, such kinds of sensors are always constrained to various kinds of limitations in energy supply, computational capabilities, and bandwidth. The energy limitation is the most important of them all because it is directly related to the lifetime of the whole network. A network lifetime is simply the measure of how long the sensors sustain. The longer the battery sustains, the better the network lifetime. Thus conserving power is always an important factor while designing any wireless sensor network protocol. In this paper, a novel approach using fuzzy logic has been implemented for a heterogeneous network. The upcoming sections include a discussion on various methods proposed erstwhile by various researchers on methods for clustering, followed by a brief description of the proposed method and finally followed by analyzing the various results obtained after the simulation.
In current scenarios the demand for WSN had rapidly increased in various applications like weathe... more In current scenarios the demand for WSN had rapidly increased in various applications like weather monitoring, petroleum and military due to low power, small size, light weight, and wireless sensors. However, these inexpensive sensors are equipped with limited battery power and thus constrained in energy. One of the major issues with WSN is that one need to increase the lifetime of network. Generally, lifetime of network is defined as the time whenever the first node fails to send its information to base station. This issue can be resolved by implementing data aggregation technique as it decreases data traffic and further saves energy by merging multiple incoming packets into a single packet whenever the sensed information are highly correlated. Numerous researches have been carried out to further extent network lifetime. I. INTRODUCTION A wireless sensor network is typically composed of a large number of sensor nodes, which may be densely deployed either inside the phenomenon or very close to it, depending on the field of their usage. The position of sensor nodes need not be engineered or predetermined. This envisages random deployment in inaccessible terrains or disaster relief operations. On the other hand, this also indicates that sensor network protocols and algorithms need to possess self-organizing capabilities. Another unique feature of sensor networks is the cooperative effort of sensor nodes. Instead of sending the raw data to the nodes responsible for the fusion, sensor nodes use their processing abilities to locally carry out simple computations and transmit only the required and partially processed data. Realization of these and other sensor network applications require wireless ad hoc networking techniques. 
Although there have been a number of protocols and algorithms proposed for traditional wireless ad hoc networks, they are not well suited for the unique features and application requirements of wireless sensor networks. The differences between sensor networks and ad hoc networks are outlined below: The number of sensor nodes in a sensor network is higher than the nodes in an ad hoc network. Sensor nodes are densely deployed as compared to ad-hoc nodes. Sensor nodes are more prone to failures. The topology of a sensor network changes very frequently. Sensor nodes mainly use broadcast or multi-hop communication paradigm whereas most ad hoc networks are based on point-to-point communications. Sensor nodes have limitations of power, computational capacities, and memory. Sensor nodes may not have global identification (ID) because of the large amount of overhead and large number of sensors. Designing suitable routing algorithms for different applications, fulfilling the different performance demands has been considered as an important issue in wireless sensor networks. In these context many routing algorithms have been proposed to improve the performance demands of various applications through the network layer of the wireless sensor networks protocol stack [3, 4], but most of them are based on single-path routing. In single-path routing approach basically source selects a single path which satisfies the performance demands of the application for transmitting the load towards the sink. Though the single path between the source and sink can be developed with minimum computation complexity and resource utilization, the other factors such as the limited capacity of single path reduces the available throughput [5]. Secondly, considering the unreliable wireless links single path routing is not flexible to link failures, degrading the network performance. 
Finding an alternate path after the primary path has disrupted to continue the data transmission will cause an extra overhead and increase delay in data delivery. Due to these factors single path routing cannot be considered effective technique to meet the performance demands of various applications.
— As the network traffic increasing daily, the consumption of resources increasing with the same ... more — As the network traffic increasing daily, the consumption of resources increasing with the same rate. Cloud computing data centers are becoming gradually popular and important for provisioning the computer networking resources. In the past, it has been observed that a lot of importance was given to the resources that could be employed using Virtualization technology. But recently, the focus is moving from the data center for LAN and WAN to Cloud computing as the core system infrastructure. The requirements for such implementations depend on computer resources, storage and networking which in a way is becoming Software Defined Network (SDN). Instead of limiting the need for physical infrastructure and applications, the precisely defined network in virtual environment is the future. Software Defined Network plays an important role in providing the common platform to all the possible applications, innovations, operators, vendors and hardware components. The main objective of this literature review is to do research work and describes the ongoing efforts to address the thought-provoking issues.
— With the ever growing internet and interconnection among everything, the need for security is a... more — With the ever growing internet and interconnection among everything, the need for security is a demand of the time. Internet of things (IOT) which connects every object or device with networking capabilities is an area of great concern related to security. Objects include home automation sensors, medical equipments, vehicular sensors, nuclear reactors and any life critical real time sensing devices. This means that lack of security in IOT can pose a risk to human lives. IOT comprises of many heterogeneous devices which use diverse protocols. Each protocol follows different access mechanisms and security measures. But a unified security mechanism is still not in place in IOT. Conventional security approaches like Intrusion Detection and Prevention Systems (IDPS), Firewall are deployed at internet edge devices to protect from external attacks. But in case of IOT which is seamless and borderless network access control become more difficult. In this paper, we discuss the current state of IOT, security challenges in IOT. We describe the need for SDN and its evolution. Further we analyze architecture of IOT based on SDN. Finally a security framework has been proposed based on SDN-IOT architecture. —————————— ——————————
— In this paper of network function virtualization and its impact on 5G has been reviewed. Networ... more — In this paper of network function virtualization and its impact on 5G has been reviewed. Network function virtualization that virtualizes entire classes of network node functions into building blocks that may connect, or chain together, to create communication services. The goal of NFV is to shift the network functions from dedicated hardware devices and allow network services that are now being carried out by router firewalls load balancers and other dedicated hardware devices to be hosted on virtual machines (VMs). The NFV is important because it helps the network administrators no longer need to purchase dedicated hardware devices in order to build a service chain. Because server capacity will be able to be added through software, there will be no need for network administrators to add on their data centers, which will reduce both capital expenses (CAPex) and operating expenses (OPex). If an application running on a VM required more bandwidth, then the administrator could move the VM to another physical server or provide another virtual machine on the original server to take part of the load. This flexibility will allow an IT department to respond in a faster manner to change business goals and network service demands. The Aim of Network Function Virtualisation is to transform the way, the network operator's designs networks, by evolving standard IT virtualisation technology to consolidate many network equipment types onto industry standard high volume servers, switches and storage, which could be located in Data centres, these virtual appliances can be instantiated on demand without the installation of new equipment.
In this paper, the various stages of modern radar signal processing are studied. In modern radar sets, the conversion of radar signals to digital form is typically accomplished after IF amplification and phase-sensitive detection. The signal processor is the part of the system that separates targets from clutter on the basis of Doppler content and amplitude characteristics. Radar signal processing and radar data processing together denote all measures that compute, from the received echo signal, an evaluable image on the radar display. Radar signal processing refers to the part that still deals with the analogue (or often already digitized) magnitude of the echo signal.
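A classic instance of the Doppler-based clutter separation mentioned above is the two-pulse MTI canceller: subtracting successive echoes from the same range cell cancels stationary clutter (constant phase) while moving targets (rotating phase) survive. The sketch below is illustrative and not taken from the paper; the sample values are assumed.

```python
# Two-pulse MTI canceller on one range cell: one complex sample per pulse.

import cmath

def mti_two_pulse(echoes):
    """Return pulse-to-pulse differences; stationary clutter cancels to ~0."""
    return [echoes[n] - echoes[n - 1] for n in range(1, len(echoes))]

# Stationary clutter: identical echo every pulse -> cancels completely.
clutter = [3 + 4j] * 4
print(mti_two_pulse(clutter))  # [0j, 0j, 0j]

# Moving target: echo phase rotates by the Doppler shift each pulse,
# so the difference has a non-zero residue that passes to the detector.
target = [cmath.exp(1j * 0.8 * n) for n in range(4)]
print(all(abs(v) > 0.1 for v in mti_two_pulse(target)))  # True
```

Practical processors use longer filter banks rather than a single canceller, but the principle is the same: the filter's null sits at zero Doppler, where clutter lives.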
We present a cloud-enhanced, four-wheeled, mobile minirobot, assembled from low-cost, off-the-shelf hardware parts and open-source software building blocks. These building blocks provide a reusable and extensible base for emerging applications mixing robotics with the Internet of Things.
This paper provides an overview of the Internet of Things (IoT) with emphasis on enabling technologies, protocols, and application issues. The IoT is enabled by the latest developments in RFID, smart sensors, communication technologies, and Internet protocols. The basic premise is to have smart sensors collaborate directly without human involvement to deliver a new class of applications. The current revolution in Internet, mobile, and machine-to-machine (M2M) technologies can be seen as the first phase of the IoT. In the coming years, the IoT is expected to bridge diverse technologies to enable new applications by connecting physical objects together in support of intelligent decision making. This paper starts by providing a horizontal overview of the IoT. Then, we give an overview of some technical details that pertain to the IoT enabling technologies, protocols, and applications. Compared to other survey papers in the field, our objective is to provide a more thorough summary of the most relevant protocols and application issues to enable researchers and application developers to get up to speed quickly on how the different protocols fit together to deliver desired functionalities without having to go through RFCs and the standards specifications. We also provide an overview of some of the key IoT challenges presented in the recent literature and provide a summary of related research work. Moreover, we explore the relation between the IoT and other emerging technologies including big data analytics and cloud and fog computing. We also present the need for better horizontal integration among IoT services. Finally, we present detailed service use-cases to illustrate how the different protocols presented in the paper fit together to deliver desired IoT services.
While searching for information about cloud computing over the Internet, I came across the computing keyword Green Computing, and after considerable research I tried to form a clear picture of it. This paper will help researchers and administrators gain a clear understanding of Green Computing. In this modern era of globalization, computers play a vital role in every field, so the need for computers increases day by day, and to fulfil this need a large amount of electricity is required to manufacture and run computer functional units such as CPUs, memory, monitors and peripheral devices. As a result, a large amount of carbon is released into the atmosphere; these carbon emissions affect us directly or indirectly and are responsible for hazards on the earth. Moreover, the growing demand for Cloud infrastructure has drastically increased the energy consumption of data centres, which has become a critical issue. High energy consumption not only translates to high operational cost, reducing the profit margin of Cloud providers, but also leads to high carbon emissions, which is not environmentally friendly. Many techniques have been proposed to overcome this problem; one of them is Green Computing, which promotes the eco-friendly use of computers and their resources. In order to design such solutions, a deep analysis of the Cloud is required with respect to power efficiency. IT industries are responsible for about 3% of the world's energy consumption, with an increase of 20% per year. The main goal of Green Computing is to increase the efficiency of IT products over their lifecycle and to endorse the recycling of useless goods and factory waste. This paper introduces some energy-efficient techniques of Green Computing and their future use.
In numerous areas of computing there has been a continuous tension between centralization and decentralization, which prompted the move from mainframes to PCs and local networks in the past, and the consolidation of services and applications in clouds and data centres. The expansion of technological advances such as high-capacity mobile end-user devices, powerful dedicated connection boxes deployed in most homes, powerful wireless networks, and IoT (Internet of Things) devices, along with growing client concerns about privacy, trust and independence, calls for handling information at the edge of the network. This requires moving the control of computing applications, data and services away from the core to the edge of the Internet. The relevance of cloud computing to mobile networks is on an upward spiral. Edge computing can address concerns such as response-time requirements, bandwidth cost saving, elastic scalability, battery-life constraints and QoS. MEC additionally offers a high-bandwidth environment and ultra-low latency, giving real-time access to radio networks at the edge of the mobile network. Currently, it is being used to enable on-demand elastic access to, or interaction with, a shared pool of reconfigurable computing resources such as servers, peer devices, storage and applications at the edge of the wireless network, in close proximity to mobile users. It overcomes obstacles of traditional central clouds by offering wireless network information and local context awareness as well as low latency and bandwidth conservation. In this paper, we introduce edge computing and the edge cloud, explain why we need edge computing, and present its classifications, various frameworks, applications and several case studies. Finally, we present several challenges, concerns and the future scope of edge computing.
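The response-time and battery-life trade-offs listed above reduce, in the simplest model, to comparing two estimated latencies: run the task on the device, or ship its input to an edge server and run it there. The sketch below is a hedged illustration, not from the paper; every function name and number is an assumption.

```python
# Toy offloading decision: compare local execution time against
# (upload time + network round trip + remote execution time).

def response_time_local(task_cycles, device_hz):
    return task_cycles / device_hz

def response_time_edge(task_bytes, task_cycles, uplink_bps, edge_hz, rtt_s):
    # transmit input + network round trip + faster remote execution
    return task_bytes * 8 / uplink_bps + rtt_s + task_cycles / edge_hz

def should_offload(task_bytes, task_cycles, device_hz,
                   uplink_bps, edge_hz, rtt_s):
    return response_time_edge(task_bytes, task_cycles,
                              uplink_bps, edge_hz, rtt_s) \
        < response_time_local(task_cycles, device_hz)

# Heavy computation on a small input: the nearby MEC server wins
# (2.0 s locally vs. about 0.213 s at the edge).
print(should_offload(task_bytes=50_000, task_cycles=2_000_000_000,
                     device_hz=1_000_000_000, uplink_bps=50_000_000,
                     edge_hz=10_000_000_000, rtt_s=0.005))  # True
```

With a distant central cloud the RTT term grows by an order of magnitude, which is exactly the low-latency advantage of MEC that the abstract emphasizes.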
Index Terms: Mobile Edge Computing (MEC), Internet of Things (IoT)
The purpose of this paper is to compare the traffic-handling efficiency of two STM-16 devices, the Huawei OptiX 3500 and the Tejas TJ 1400. The present study compares the efficiency of both products. The development of transmission systems such as MADM and DWDM is moving towards higher bandwidth-handling capacity. The evolution of mobile generations from 1G to 5G, together with the requirement of higher bandwidth for leased lines, internet broadband and NGN, demands higher bandwidth/traffic-handling capacity from the transmission system. All the latest developments in the fields of PSTN, broadband and mobile need higher bandwidth to serve their purpose.
A hot topic in today's context, FPGA implementation of the Internet of Things (IoT) is a recently introduced technique. The Internet of Things is connecting people and smart devices on a scale that was once unimaginable. A further challenge for IoT is to handle the vast amount of sensed data generated by smart devices that are resource-limited and subject to missing data due to link failures. In this paper we present a concept: the use of a low-cost FPGA implementation of an entire IoT subset, including the TCP/IP protocol, control system, data acquisition, etc. IoT applications on FPGA platforms have received significant attention from the research community in the past few years. This technique offers a complete, low-cost, powerful and user-friendly way of building a 24-hour real-time monitoring and remote sensing system. The main aim of this research is to highlight how users can access FPGA-based design resources from anywhere. We thus present a concept in which momentarily unused resources can be applied to executing various tasks automatically.