Cloud Research Papers - Academia.edu
An intrusive bioptical probe provides radial profiles of the PDF (Probability Density Function) of the bubble, cloud and emulsion phase behaviours in a fluidized bed of Geldart A particles. The fluidizing velocity ranges between 0.20 m/s and 0.80 m/s. The bed porosity and the overall averaged bubble chords and velocities are well predicted by correlations from the literature. The cloud thickness is closely related to the bubble chord. The amount of gas traveling through the bubble phase never exceeds 15-20% of the total gas flow rate. The averaged porosity of the bubble, cloud and emulsion phases is 88-92%, 65-70% and 50-55%, respectively. The reduced bubble porosity distribution obeys a constant normal law over the fluidizing velocity range. Each distribution of the reduced bubble chord and velocity, and of the reduced cloud thickness, obeys a constant log-normal law over the fluidizing velocity range.
Cloud servers are a platform for enabling convenient, on-demand network access to a shared pool of configurable server resources (memory, networks, storage, CPU, applications, and services) that can be rapidly provisioned and released with minimal management effort or cloud service provider interaction. Cloud servers are widely used; however, data security is one of the major barriers to adoption of cloud storage. Users can store data and use it on demand, or for their applications, without keeping any local copy of the data on their machines. Cloud storage technologies offer the promise of massive cost savings combined with increased IT agility due to the pay-per-use model. However, this technology challenges many traditional approaches to hosting-provider and enterprise-application design and management. Users are able to upload data to cloud storage without worrying about checking or verifying its integrity. Hence, integrity auditing for cloud data is an important task to ensure...
In the last two decades, the continuous increase in computational power and recent advances in web technology have led to the production of large amounts of data, which calls for a large-scale data processing mechanism to handle this volume of data. MapReduce is a programming model for large-scale distributed data processing in an efficient and transparent way, owing to its excellent fault tolerance, scalability and ease of use. Currently, there are several options for using MapReduce in cloud environments, such as using MapReduce as a service, setting up one's own MapReduce cluster on cloud instances, or using specialized cloud MapReduce runtimes that take advantage of cloud infrastructure services. Cloud computing has recently emerged as a new paradigm that provides computing infrastructure and a large-scale data processing mechanism over the network. The cloud is on-demand, scalable and highly available, so implementing MapReduce on top of cloud services yields a faster, more scalable and more highly available MapReduce framework for large-scale data processing. In this paper we explain how to implement MapReduce in the cloud and, at the end, compare implementations of MapReduce on the Azure Cloud, the Amazon Cloud and Hadoop.
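As a rough, hedged illustration of the programming model described above, the following sketch runs a word count with plain map and reduce functions in a single Python process. A cloud MapReduce runtime (Hadoop, or the Azure and Amazon services compared in the paper) would shard the input, run the phases on many nodes and handle failures transparently; the function names here are ours, not from the paper.

```python
# Minimal, single-process sketch of the MapReduce model (word count).
# A real cloud runtime would distribute mappers/reducers across many nodes
# and handle shuffling and failures transparently.
from collections import defaultdict
from typing import Iterable, Iterator

def map_phase(document: str) -> Iterator[tuple[str, int]]:
    """Mapper: emit (word, 1) for every word in the document."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs: Iterable[tuple[str, int]]) -> dict[str, list[int]]:
    """Group intermediate values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key: str, values: list[int]) -> tuple[str, int]:
    """Reducer: sum the counts for one word."""
    return key, sum(values)

documents = ["the cloud scales", "the cloud is elastic"]
intermediate = (pair for doc in documents for pair in map_phase(doc))
result = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(result)  # {'the': 2, 'cloud': 2, 'scales': 1, 'is': 1, 'elastic': 1}
```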
A main challenge for service providers is managing service-level agreements (SLAs) with their customers while satisfying their business objectives, such as maximizing profits. Most current systems fail to consider business objectives and thus to provide a complete SLA management solution. This work proposes an SLA-driven management solution that aims to maximize the provider's profit by reducing resource costs as well as fines owing to SLA violations. Specifically, this work proposes a framework that comprises multiple, configurable control loops and supports automatically adjusting service configurations and resource usage in order to maintain SLAs in the most cost-effective way. The framework targets services implemented on top of large-scale distributed infrastructures, such as clouds. Experimental results demonstrate its effectiveness in maintaining SLAs while reducing provider costs.
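The control-loop idea can be sketched as follows. This is not the framework proposed in the paper, only a minimal illustration under assumed values (a 200 ms latency SLA, a random monitoring probe, one replica as the unit of resource cost) of how a loop might trade resource cost against SLA fines.

```python
# Minimal sketch of one SLA-driven control loop: scale replicas up when the SLA
# is at risk (to avoid fines) and down when there is headroom (to cut resource cost).
# Thresholds, costs and the monitoring stub are illustrative assumptions.
import random

SLA_LATENCY_MS = 200.0
SCALE_UP_AT = 0.9 * SLA_LATENCY_MS     # act before a violation occurs
SCALE_DOWN_AT = 0.5 * SLA_LATENCY_MS   # release resources when well below the SLA

def measure_latency_ms(replicas: int) -> float:
    """Stand-in for a monitoring probe: more replicas -> lower latency."""
    return random.uniform(80, 400) / replicas

def control_step(replicas: int) -> int:
    latency = measure_latency_ms(replicas)
    if latency > SCALE_UP_AT:
        replicas += 1                  # spend more to avoid SLA fines
    elif latency < SCALE_DOWN_AT and replicas > 1:
        replicas -= 1                  # save cost while still meeting the SLA
    print(f"latency={latency:6.1f} ms -> replicas={replicas}")
    return replicas

replicas = 1
for _ in range(5):
    replicas = control_step(replicas)
```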
The Cloud Computing concept offers dynamically scalable resources provisioned as a service over the Internet. Economic benefits are the main driver for the Cloud, since it promises the reduction of capital expenditure (CapEx) and operational expenditure (OpEx). In order for this to become reality, however, there are still some challenges to be solved. Amongst these are security and trust issues, since the user's data has to be released to the Cloud and thus leaves the protection sphere of the data owner. Most of the discussions on this topic are mainly driven by arguments related to organisational means. This paper focuses on technical security issues arising from the usage of Cloud services and especially from the underlying technologies used to build these cross-domain Internet-connected collaborations.
Cloud Computing is being projected by the major cloud service provider IT companies, such as IBM, Google, Yahoo and Amazon, as the fifth utility, where clients have access to processing for applications and software projects that need very high processing speed for compute-intensive workloads and huge data capacity for scientific and engineering research problems, as well as for e-business and data content network applications. These services for different types of clients are provided under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in Cloud Computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT industry experts and cloud clients, and highlights the cloud providers' response to cloud security risks.
A Cloud Computing system is intended to improve and automate operations by controlling them from a single point. By using a single point of control, this goal is accomplished through the elimination of duplicate entry and through data integrity, detailed drill-down, simple training, manageable support, minimal IT maintenance, easy upgrades and reduced costs. Overall, the advantages of cloud computing usage fulfill the original intentions of the business, as it allows process manufacturers to manage their business as simply and efficiently as possible. Enterprise Resource Planning (ERP) software is designed to improve and automate business process operations. However, there are many unnecessary administrative and procedural costs and delays often associated with this practice. Examples include duplicate data entry, data corruption, increased training, complicated supplier relations, greater IT support and software incompatibilities. The purpose of this system is a single point of control, duplicate entry elimination, data integrity, detailed drill-down, basic training, manageable support, security, minimal IT maintenance, easy upgrades and reduced costs.
The cloud is the best method used for the utilization and organization of data. The cloud provides many resources for us via the Internet. There are many technologies used in cloud computing systems; each one uses a different kind of protocols and methods. Many tasks that cannot be executed on a user's own computer can be executed on different servers every second. The most popular technologies used in the cloud system are Hadoop, Dryad, and other MapReduce frameworks. Also, there are many tools used to optimize the performance of the cloud system, such as Cap3, HEP, and CloudBurst. This paper reviews in detail the cloud computing system, the technologies it uses, and the best technologies to use with it according to multiple factors and criteria, such as cost and speed, with their pros and cons. Moreover, a comprehensive comparison of the tools used for the utilization of cloud computing systems is presented.
Cloud computing provides IT services to users worldwide. Data centers in Clouds consume large amounts of energy, leading to high operational costs; therefore, green energy computing is a solution for decreasing operational costs. This survey presents efficient resource allocation and scheduling algorithms/techniques analyzed on different network parameters without compromising network performance and SLA constraints. Results are analyzed on different measures, providing significant cost savings and an improvement in energy efficiency.
This study aims to explore how cloud computing (CC) is being developed in the Brazilian academic context. To this end, a literature review was conducted, searching for articles, dissertations and theses on the subject in order to map the state of the art of this phenomenon in the national context. This research is justified by the intention of mapping a rare or emerging phenomenon in order to assist future researchers who will work on CC studies. Through this article it is possible to show that national research is still at an embryonic stage, with topics focused on technology adoption. Future studies are suggested on aspects of CC not yet addressed at the organizational and governance levels, and on the possible development of models that can be used in the national context, given that the use of the technology has specific particularities bounded by legislation, users' cultural characteristics and forms of management.
There is a growing requirement for Internet of Things (IoT) infrastructure to ensure low response time to provision latency-sensitive real-time applications such as health monitoring, disaster management, and smart homes. Fog computing offers a means to meet such requirements, via a virtualized intermediate layer that provides data, computation, storage, and networking services between Cloud datacenters and end users. A key element within such Fog computing environments is resource management. While there are existing resource managers in Fog computing, they only focus on a subset of the parameters important to Fog resource management, which encompass system response time, network bandwidth, energy consumption and latency. To date no existing Fog resource manager considers these parameters simultaneously for decision making, which in the context of smart homes will become increasingly key. In this paper, we propose a novel resource management technique (ROUTER) for fog-enabled Cloud computing environments, which leverages Particle Swarm Optimization to optimize these parameters simultaneously. The approach is validated within an IoT-based smart home automation scenario, and evaluated within the iFogSim toolkit driven by empirical models from a small-scale smart home experiment. Results demonstrate that our approach achieves a reduction of 12% in network bandwidth, 10% in response time, 14% in latency and 12.35% in energy consumption.
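As a hedged illustration of the optimization step, the toy sketch below runs a small particle swarm over two hypothetical knobs (CPU share and bandwidth share) to minimize a weighted cost combining response time, latency, bandwidth and energy. The cost model, weights and bounds are invented for illustration and are not ROUTER's actual formulation.

```python
# Toy particle swarm optimization over two resource knobs, minimizing a weighted
# cost of response time, latency, bandwidth and energy. The cost model is a
# placeholder; ROUTER's real objective and constraints are defined in the paper.
import random

def cost(x):
    cpu, bw = x                            # allocated CPU and bandwidth shares in (0, 1]
    response = 1.0 / cpu                   # more CPU -> faster response (toy model)
    latency = 1.0 / bw                     # more bandwidth -> lower latency (toy model)
    energy = 2.0 * cpu + 1.0 * bw          # more resources -> more energy
    return 0.3 * response + 0.3 * latency + 0.2 * bw + 0.2 * energy

def pso(n_particles=20, iters=50, w=0.7, c1=1.4, c2=1.4):
    pos = [[random.uniform(0.1, 1.0) for _ in range(2)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.1, pos[i][d] + vel[i][d]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=cost)
    return gbest, cost(gbest)

best, best_cost = pso()
print(f"best allocation (cpu, bw) = {best}, cost = {best_cost:.3f}")
```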
Cloud computing is rapidly gaining traction in business. It offers businesses online services on demand (such as Gmail, iCloud and Salesforce) and allows them to cut costs on hardware and IT support. This is the first paper in business... more
Cloud computing is rapidly gaining traction in business. It offers businesses online services on demand (such as Gmail, iCloud and Salesforce) and allows them to cut costs on hardware and IT support. This is the first paper in business ethics dealing with this new technology. It analyzes the informational duties of hosting companies that own and operate cloud computing datacentres (e.g., Amazon). It considers the cloud services providers leasing ‘space in the cloud’ from hosting companies (e.g., Dropbox, Salesforce). And it examines the business and private ‘clouders’ using these services. The first part of the paper argues that hosting companies, services providers and clouders have mutual informational (epistemic) obligations to provide and seek information about relevant issues such as consumer privacy, reliability of services, data mining and data ownership. The concept of interlucency is developed as an epistemic virtue governing ethically effective communication. The second part considers potential forms of government restrictions on or proscriptions against the development and use of cloud computing technology. Referring to the concept of technology neutrality, it argues that interference with hosting companies and cloud services providers is hardly ever necessary or justified. It is argued, too, however, that businesses using cloud services (e.g., banks, law firms, hospitals etc. storing client data in the cloud) will have to follow rather more stringent regulations.
The leading edge of the Internet of Things (IoT) gradually makes items available on the Internet, but data processing is not scaling effectively to fulfil the requirements of a centralized cloud environment. One of the main reasons for this problem is that deadline-oriented cloud applications, such as health monitoring, flight control systems and command-and-control systems, need minimum latency and response time, whereas transmitting large amounts of data (Big Data) to a centralized database, and then from the database to an IoT application or end device, leads to performance degradation. Fog computing is an innovative solution to reduce delay (or latency), resource contention and network congestion, in which the cloud is extended to the edge of the network. In this paper we propose a fog-assisted information model which delivers healthcare as a cloud service using IoT devices. Further, the proposed model efficiently manages the data of heart patients arriving through their user requests. The iFogSim toolkit is used to analyse the performance of the proposed model in a Fog-enabled cloud environment.
The ever-increasing amount of personal or sensitive data stored in cloud data storage needs to be protected, since losing it is a very serious problem. As its popularity increases, cloud storage is becoming an option for users to keep their data online, but it poses many security threats and the challenge of protecting data from being hacked. In response to recent successful attacks, cloud storage providers have implemented many security controls, such as two-factor authentication. The study aims to determine the effectiveness of this authentication method based on the algorithm used, discuss its potential benefits and weaknesses, and suggest other security measures.
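For context, two-factor authentication codes of the kind mentioned in the abstract are commonly generated with TOTP (RFC 6238). The sketch below is a minimal standard-library implementation for illustration only; the paper does not specify which algorithm the studied providers use, and production systems should rely on vetted libraries and secure secret storage.

```python
# Minimal TOTP (time-based one-time password) sketch, the scheme behind many
# two-factor authentication codes (RFC 6238). Illustrative only.
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a one-time code from a Base32 shared secret and the current time."""
    key = base64.b32decode(shared_secret_b32, casefold=True)
    counter = int(time.time()) // interval              # time step number
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()  # HOTP uses HMAC-SHA1
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

if __name__ == "__main__":
    # 'JBSWY3DPEHPK3PXP' is a common demo secret, not a real credential.
    print("Current code:", totp("JBSWY3DPEHPK3PXP"))
```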
In this paper, we propose an intelligent and secure Internet of Things approach for the healthcare system that monitors the patient's heart rate in real time and from any place. Thanks to the agent, the proposed system can predict a critical condition before it even happens and take fast and apt decisions in an emergency case. Based on the experimentation, the proposed system is convenient, reliable, and ensures data security at a low cost. The proposed algorithm outperforms other algorithms regarding the system's operational efficiency. It is more suitable for devices with power, storage, and processing limitations, such as IoT devices. Also, agents are among the best current technologies for heterogeneous and distributed systems, such as the Internet of Things. Moreover, this approach's scalability makes it suitable for a broad range of IoT environments, including smart homes, smart cities, dynamic and large-area networks, etc.
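A hedged sketch of the monitoring idea: a simple agent-style check that raises an emergency on an out-of-range heart rate and a warning on a rapid rise relative to the recent average. The window size and thresholds are illustrative assumptions, not values from the paper.

```python
# Toy agent-style early-warning check for a heart-rate stream.
# Thresholds and window size are illustrative assumptions.
from collections import deque
from statistics import mean

class HeartRateAgent:
    def __init__(self, window: int = 10, low: int = 50, high: int = 120, trend_alert: float = 15.0):
        self.readings = deque(maxlen=window)
        self.low, self.high, self.trend_alert = low, high, trend_alert

    def observe(self, bpm: float) -> str:
        self.readings.append(bpm)
        if bpm < self.low or bpm > self.high:
            return "EMERGENCY: heart rate out of safe range"
        # Predictive hint: a fast rise over the recent average may precede a critical event.
        if len(self.readings) == self.readings.maxlen:
            baseline = mean(list(self.readings)[:-1])
            if bpm - baseline > self.trend_alert:
                return "WARNING: rapid increase detected, notify caregiver"
        return "OK"

agent = HeartRateAgent()
for sample in [72, 75, 74, 76, 78, 80, 79, 81, 83, 110, 135]:
    print(sample, agent.observe(sample))
```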
OpenNebula stands for Openness, Excellence, Cooperation and Innovation, built around resource pooling and dynamic provisioning. Its aim is to develop the simplest, most powerful, highly scalable and adaptable solution for building and managing virtualized data centers and enterprise clouds. It provides cloud builders and integrators with a modular system that can implement a variety of cloud architectures and can interface with any datacenter service. It provides cloud developers and users with a choice of cloud and system interfaces, from open cloud to de facto standards, to support the creation of a rich ecosystem of higher-level components. OpenNebula is the open-source industry standard for data center virtualization, offering a simple but feature-rich and flexible solution to build and manage enterprise clouds and virtualized data centers. OpenNebula is designed to be simple: simple to install, update and operate by admins, and simple to use by end users. Being focused on simplicity, it integrates with existing technologies whenever possible.
In recent years, the Internet of Things (IoT) has become well known. It has attracted much attention thanks to its powerful applications in different fields. Currently, billions of IoT devices are connected. The massive rise in the number of connected devices instantly produces more data, which requires high-capacity storage and processing technology. Besides, IoT equipment has limited processing and storage capability, which does not allow data processing, or even storage, on the devices themselves. Also, things cannot make smart and quick decisions to achieve their goals, and the ability of a thing to communicate with other things depends on service similarity and communication protocols; this impacts the performance, efficiency, development, and scalability of IoT applications. This thesis provides three main contributions. In the first contribution, we propose a Cloud-IoT approach to provide an efficient solution for the storage and analysis of data generated by IoT applications and to improve IoT scalability. In order to demonstrate the viability of the proposed approach, a Cloud-IoT system for remote patient monitoring is developed.
IoT security becomes a more challenging problem because the computing and communication limits of IoT devices make IoT more vulnerable to security and privacy attacks. Hence, as a second contribution, we propose a robust solution based on lightweight cryptography to ensure data security. We have also used an access control mechanism to ensure data privacy. The experimentation demonstrates that the proposed solution is more suitable for IoT devices. To tap the most out of massive data streams, IoT needs real-time management and analysis of data. In order to enhance the intelligence of IoT devices and to improve our first approach, we have developed an efficient, autonomous, and real-time agent-based solution for data management and analysis as a third contribution. The proposed system can make fast and apt decisions in an emergency case.
Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute computation across a great number of distributed computers, rather than a local computer or remote server. Realization of the cloud computing infrastructure requires access to data anywhere, at any time, on any device, at a sufficient perceived quality of service. The running of the enterprise's data center then works much like the Internet: the enterprise uses resources in the applications that need them, and accesses compute and storage systems according to requirements. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. This article introduces the background and principles of cloud computing, as well as its characteristics, styles and current status. Cloud computing enables the externalization of software resources at a very large scale for residential users. The fields of application of cloud computing seem almost unlimited, with all sectors of our economy concerned. Applications range from remote medical diagnostics, collaborative image processing, scientific computation, financial operations and industrial processes to radio astronomy, which already benefits financially from the advantages of Clouds.
Cloud computing is growing significantly day by day and has created a shift in the IT industry. It brings several advantages to consumers by delivering computing as a service, such as self-service, on-demand access, broad network access, rapid elasticity, resource pooling, and measured service. On the other hand, organizations have many fears when deciding to migrate to cloud computing, due to the many risks they are exposed to by the cloud environment. Risk management is an appropriate method to identify, assess, and manage risks. Traditional risk management frameworks do not fit cloud computing well due to its complex environment and the assumption by those frameworks that the assets are owned and fully managed by the organization itself. The aim of this paper is to propose a new security risk management framework for the cloud computing environment. The framework focuses on the two main issues that affect the effectiveness of risk management: loss of control and the multi-tenant architecture of cloud computing. The proposed framework has been evaluated on a hypothetical cloud computing provider to present its step-by-step application.
The human physique has an unsteady bio-medical formation joined at various joints; even a minor issue in the lower joints harmfully affects the maintenance of a standing position. The proposed system includes an inbuilt observing circuit which gives footstep count, weight count, walking speed, travel mapping, distance count and pro-health tips. It uses a load-cell pressure sensor which turns the system off to avoid unwanted battery usage. The system includes a mobile application that presents the overall bio-medical information. The method is implemented by putting geolocation, real-time cloud-server microprocessors for real-time data collection, and a health-monitoring mobile application into service.
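A hedged sketch of the footstep-counting and battery-saving behaviour described above: steps are counted on rising crossings of a pressure threshold from the load cell, and the circuit is put to sleep after a period with no load. The threshold and timeout values are illustrative assumptions, not values from the paper.

```python
# Toy step counter over load-cell pressure samples: a step is counted on each
# rising crossing of a pressure threshold, and the device "sleeps" after a period
# of no load to save battery. Threshold/timeout values are illustrative assumptions.
STEP_THRESHOLD = 20.0      # pressure units indicating foot contact
IDLE_SAMPLES_TO_SLEEP = 5  # consecutive no-load samples before powering down

def count_steps(samples):
    steps, idle, loaded, asleep = 0, 0, False, False
    for p in samples:
        if p >= STEP_THRESHOLD:
            if not loaded:
                steps += 1          # rising edge = one footstep
            loaded, idle, asleep = True, 0, False
        else:
            loaded = False
            idle += 1
            if idle >= IDLE_SAMPLES_TO_SLEEP:
                asleep = True       # load cell sees no weight: turn the system off
    return steps, asleep

pressure_trace = [0, 35, 40, 5, 0, 38, 3, 0, 0, 0, 0, 0]
steps, asleep = count_steps(pressure_trace)
print(f"steps={steps}, sleeping={asleep}")
```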
Cloud computing provides a flexible and convenient way for data sharing, which brings various benefits for both the society and individuals. But there exists a natural resistance for users to directly outsource the shared data to the cloud server since the data often contain valuable information.
Billing in Infrastructure as a Service (IaaS) is a complex task that is often compromised due to the lack of simultaneous support for the various interactions involved (consumer, provider, budget manager, agent and online payment) along with necessary features like discounts, taxes and plans. As long as billing is treated as a last essential capability, an appropriate framework cannot be provided to IaaS cloud consumers. In this paper, an IaaS Online Billing System (IOBS) is proposed that provides transparency of consumption, billing and frequency of service usage for a cloud-based pay-per-use system. The system describes various network and user-interface interactions, and the billing rules are stored in a database. The model has been verified through UML, which demonstrates that IOBS is effective in improving user interaction by reducing time and increasing customer satisfaction.
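A minimal sketch of a pay-per-use bill of the kind such a system would produce, with a discount applied before tax. The resource names, rates and percentages are illustrative assumptions, not values from the IOBS paper.

```python
# Hedged sketch of a pay-per-use IaaS bill calculation with discount and tax.
# Rates, resource names and percentages are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class UsageRecord:
    resource: str        # e.g. "vcpu-hours", "gb-storage", "gb-egress"
    quantity: float
    unit_price: float    # price per unit in the consumer's currency

def compute_bill(usage: list[UsageRecord], discount_rate: float = 0.10, tax_rate: float = 0.18) -> dict:
    """Sum metered usage, then apply a plan discount followed by tax."""
    subtotal = sum(u.quantity * u.unit_price for u in usage)
    discounted = subtotal * (1 - discount_rate)
    total = discounted * (1 + tax_rate)
    return {"subtotal": round(subtotal, 2),
            "after_discount": round(discounted, 2),
            "total_due": round(total, 2)}

invoice = compute_bill([
    UsageRecord("vcpu-hours", 720, 0.04),
    UsageRecord("gb-storage", 200, 0.02),
    UsageRecord("gb-egress", 50, 0.09),
])
print(invoice)
```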
Serverless computing has rapidly grown following the launch of Amazon's Lambda platform. Function-as-a-Service (FaaS), a key enabler of serverless computing, allows an application to be decomposed into simple, standalone functions that are executed on a FaaS platform. The FaaS platform is responsible for deploying and facilitating resources for the functions. Several of today's cloud applications are spread over heterogeneous connected computing resources and are highly dynamic in their structure and resource requirements. However, FaaS platforms are limited to homogeneous clusters and homogeneous functions and do not account for the data access behavior of functions before scheduling. We introduce an extension of FaaS to heterogeneous clusters, supporting heterogeneous functions through a network of distributed heterogeneous target platforms called the Function Delivery Network (FDN). A target platform is a combination of a cluster of homogeneous nodes and a FaaS platform on top of it. The FDN provides Function-Delivery-as-a-Service (FDaaS), delivering each function to the right target platform. We showcase the opportunities the FDN offers, such as varied target-platform characteristics, the possibility of collaborative execution between multiple target platforms, and localization of data, in fulfilling two objectives when scheduling functions: Service Level Objective (SLO) requirements and energy efficiency. These are evaluated over five distributed target platforms using FDNInspector, a tool developed by us for benchmarking distributed target platforms. Scheduling functions on an edge target platform in our evaluation reduced the overall energy consumption by 17× without violating the SLO requirements in comparison to scheduling on a high-end target platform.
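A hedged sketch of SLO- and energy-aware placement in the spirit of the FDN: pick the target platform that meets a function's latency SLO at the lowest estimated energy cost. The platform names, latency and energy figures are illustrative assumptions, not the paper's scheduler.

```python
# Toy SLO- and energy-aware placement: choose the target platform that satisfies
# the function's latency SLO at the lowest estimated energy cost.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetPlatform:
    name: str
    est_latency_ms: float          # predicted end-to-end latency for this function
    energy_per_invocation_j: float

def schedule(platforms: list[TargetPlatform], slo_latency_ms: float) -> Optional[TargetPlatform]:
    feasible = [p for p in platforms if p.est_latency_ms <= slo_latency_ms]
    return min(feasible, key=lambda p: p.energy_per_invocation_j) if feasible else None

platforms = [
    TargetPlatform("edge-cluster", est_latency_ms=40, energy_per_invocation_j=0.5),
    TargetPlatform("regional-dc", est_latency_ms=25, energy_per_invocation_j=4.0),
    TargetPlatform("high-end-dc", est_latency_ms=15, energy_per_invocation_j=9.0),
]
choice = schedule(platforms, slo_latency_ms=50)
print("Selected:", choice.name if choice else "no feasible platform")
```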
This research paper describes results from an experimental study of conventional document clustering techniques implemented in commercial spaces so far. In particular, we compare the main approaches to document clustering: agglomerative hierarchical document clustering and K-means. Through this paper, we generate and implement a checker algorithm which deals with duplication of a document's content against the rest of the documents in the cloud. We also generate the algorithm required to deal with the classification of the cloud data; the classification in this algorithm is done on the basis of the date the data were uploaded. We take the ratio of both vectors and generate a score which rates the document in the classification.
- by GRD JOURNALS and +1
- Algorithms, Plagiarism, Clustering, Classification
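A hedged sketch of the duplicate-content check described in the abstract above: term-frequency vectors and cosine similarity flag a new document as a likely duplicate of one already stored in the cloud. The 0.9 threshold and the helper names are illustrative assumptions, not the paper's algorithm.

```python
# Toy duplicate-content check: term-frequency vectors plus cosine similarity.
import math
import re
from collections import Counter

def tf_vector(text: str) -> Counter:
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def duplicate_score(new_doc: str, stored_docs: list[str], threshold: float = 0.9):
    """Return the best similarity against stored documents and a duplicate flag."""
    new_vec = tf_vector(new_doc)
    scores = [cosine(new_vec, tf_vector(d)) for d in stored_docs]
    best = max(scores, default=0.0)
    return best, best >= threshold

best, is_dup = duplicate_score("cloud data clustering with k-means",
                               ["k-means clustering of cloud data", "weather report for tuesday"])
print(f"best similarity={best:.2f}, duplicate={is_dup}")
```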
Water is an important constituent on this earth for every living being, whether human, animal, plant or micro-organism. Everyone relies on water. The human body is made up of 60-70% water. Only 2.5% of fresh water is present on earth, out of which 1.75% is in the form of glaciers and ice caps. In the past there was water in almost every part of the world, but as the population increases with time, consumption of water rises, and this in turn depletes water resources like reservoirs and rivers. This is creating a major threat to living beings and has made scientists and researchers look for ways to increase the rate of precipitation, especially in areas where water is really scarce. One method is cloud seeding, which is the main topic of discussion of this research paper.
The purpose of the research is to study cloud seeding, or artificial rain, understand its working mechanism, and identify its pros and cons, the areas and regions where it is being implemented, and its capital and operational cost. The aspects discussed in our research include the capital and maintenance cost of a cloud seeding project, geological suitability, the countries where it has been successfully implemented and where it cannot be implemented, and the reasons why it is not suitable for every country to adopt it to cover their water requirements.
There is no assurance that cloud seeding really increases precipitation, as a number of questions raised by researchers have cast doubt on many experimental results.
- by Abdul Moeez Mahmood and +2
- Artificial Intelligence, Artificial Life, Cloud, Rainfall
Distributed computing is done on many systems to solve a large-scale problem. The growth of high-speed broadband networks in developed and developing countries, the continual increase in computing power, and the rapid growth of the Internet have changed the way society manages information and information services. Historically, the state of computing has gone through a series of platform and environmental changes. Distributed computing holds great promise for using computer systems effectively. As a result, supercomputer sites and datacenters have changed from providing high-performance floating-point computing capabilities to concurrently servicing huge numbers of requests from billions of users. A distributed computing system uses multiple computers to solve large-scale problems over the Internet, becoming data-intensive and network-centric. The applications of distributed computing have become increasingly widespread.
- by Riya Patel
- Computer Networks, P2P, Cloud, Jungle
Cloud computing is envisioned as the next generation architecture for IT enterprises, and has proliferated due to the advantages it provides. Cloud computing provides solutions for carrying out efficient, scalable and low-cost computing. The pay-per-usage concept of Cloud computing increases the resource utilisation of a vendor's computing power and resources; at the same time, it results in reduced hardware costs for its users. It also provides access mobility, easier maintenance, scalability and operability in terms of its management and usage. Because of the facilities and solutions it provides to the industry for next generation computing, it is vulnerable to a variety of known and unknown attacks from attackers. Hence, securing a Cloud environment is a critical problem that needs urgent attention. This article focuses on a taxonomy of possible attacks on a Cloud environment and a taxonomy of defences. The attack taxonomy describes existing threats to Cloud security, and the defence taxonomy gives a classification of the various countermeasures that can be taken to protect the Cloud environment from such attacks. The aim of this article is to provide researchers, academicians and industry with a better understanding of existing attacks and defence mechanisms for Cloud security, in order to give a clear vision of the challenges that should be worked on to ensure next generation security for Cloud computing.
Secure data access and effective data sharing in the public cloud is a problem to be solved. This paper focuses on secure data sharing and storage, providing high security and efficiency. In this mechanism, group members can communicate with each other anonymously. Group members use a common conference key to share and store data securely on the cloud. Note that elliptic curve cryptography is utilized for secure key generation.
- by IJIRMPS International Journal and +2
- Cryptography, Cloud, Cloud Security
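A hedged sketch of elliptic-curve key agreement of the kind mentioned in the abstract above: two group members derive a shared symmetric key via ECDH and HKDF using the third-party cryptography package. A real conference-key scheme involves the whole group; this pairwise exchange is only illustrative and is not the paper's protocol.

```python
# Pairwise ECDH key agreement plus HKDF key derivation (requires the
# third-party 'cryptography' package). Illustrative only.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each member generates an EC key pair and publishes the public key.
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key.
alice_shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_shared = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_shared == bob_shared

# Derive a fixed-length symmetric key usable for encrypting data stored on the cloud.
conference_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                      info=b"group-data-sharing").derive(alice_shared)
print("Derived key:", conference_key.hex())
```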
This paper studies a fourth emerging service model of cloud computing, Business Process as a Service (BPaaS). BPaaS is generating hype, but it is under-researched, with a lack of clarity regarding how BPaaS differentiates itself from other service models and what opportunities and challenges it presents for management and organisations attempting to implement it. In this paper we examine the theoretical underpinnings of BPaaS with a view to developing an understanding of the potential value of adopting the BPaaS model and its implications for the realisation of business value from cloud computing. We develop the SEIFS framework (Situation, Elemental, Integration, Flexibility, and Scalability) to provide guidance in the classification of cloud services as BPaaS or otherwise, and attempt to validate this framework using a series of case studies based in Irish cloud computing firms. Our paper aims to contribute to the literature by addressing confusion in the classification of cloud services, and specifically BPaaS. We also provide insights for prospective BPaaS providers to factor into their service design and discuss how BPaaS can provide a competitive advantage. Finally, we identify and discuss a number of areas for future research.
Resource scheduling in the cloud is a challenging job, and the scheduling of appropriate resources to cloud workloads depends on the QoS requirements of cloud applications. In the cloud environment, heterogeneity, uncertainty and dispersion of resources cause problems in the allocation of resources which cannot be addressed with existing resource allocation policies. Researchers still face difficulties in selecting an efficient and appropriate resource scheduling algorithm for a specific workload from the existing literature on resource scheduling algorithms. This research presents a broad methodical literature analysis of resource management in the area of cloud in general and cloud resource scheduling in specific. In this survey, a standard methodical literature analysis technique is used, based on a complete collection of 110 research papers out of a large collection of 1206 research papers published in 19 foremost workshops, symposiums and conferences and 11 prominent journals. The current status of resource scheduling in cloud computing is divided into various categories. A methodical analysis of resource scheduling in cloud computing is presented; resource scheduling algorithms and management, their types and benefits with tools, resource scheduling aspects and resource distribution policies are described. The literature concerning thirteen types of resource scheduling algorithms is covered, and eight types of resource distribution policies are described. The methodical analysis in this research work will help researchers to find the important characteristics of resource scheduling algorithms and will also help them select the most suitable algorithm for scheduling a specific workload. Future research directions have also been suggested in this research work.
The Internet is always changing and growing. New applications and businesses are created endlessly. Besides the developing Internet, technology is changing the backdrop: broadband connectivity is becoming cheap and omnipresent, and devices are becoming more powerful and smaller, with a variety of dedicated sensors. The abundance of devices becoming connected is leading to a new paradigm: the IoT. The IoT is driven by an extension of the Internet through the addition of physical objects, combined with an ability to provide smarter services to the surroundings as more data becomes available. Across various application domains, such as smart infrastructure, healthcare, security and surveillance, transportation, retail, industrial and telecommunication, there are challenges associated with the IoT, most explicitly in the areas of trust and security, privacy and data confidentiality. In this way, things/objects are capable of recognizing events and changes in their surroundings and of acting and reacting autonomously.
Cloud computing is commonly used to represent any work done on a computer, mobile or any other device, where the data and possibly the application being used do not reside on the device but rather on an unspecified device elsewhere on the Internet. The basic premise of cloud computing is that consumers (individuals, industry, government, academia and so on) pay for IT services from cloud service providers (CSPs). Services offered in cloud computing are generally based on the three standard models (Infrastructure as a Service, Platform as a Service, and Software as a Service) defined by the National Institute of Standards and Technology (NIST). The reason for the cloud's existence is to resolve the data management problems being faced when data were stored in excess: either the required capacity was limited by the infrastructure of the business, or large capacity led to wasted capital. Apart from major factors such as initial capital and fixed service costs, the sophisticated effort required for patching, managing and upgrading the internal infrastructure is a huge obstacle to a firm's development and mobility. For many firms whose staff and culture do not have the strength to manage large data center environments and infrastructure, it would be wise to upload their files or data backups to another machine via the Internet, in order to concentrate more on the organization's primary objectives.
- by xx yy and +1
- Soft Computing, Cloud, IT
Advancement in information and communication technology (ICT) has given rise to an explosion of data in every field of operations. Working with this enormous volume of data (or Big Data, as it is popularly known) to extract useful information to support decision making is one of the sources of competitive advantage for organizations today. Enterprises are leveraging the power of analytics in formulating business strategy in every facet of their operations to mitigate business risk. The volatile global market scenario has compelled organizations to redefine their supply chain management (SCM). In this paper, we delineate the relevance of Big Data and its importance in managing end-to-end supply chains for achieving business excellence. A Big Data-centric architecture for SCM is proposed that exploits the current state-of-the-art technology of data management, analytics and visualization. The security and privacy requirements of a Big Data system are also highlighted, and several mechanisms are discussed to implement these features in a real-world Big Data system deployment in the context of SCM. Some future scope of work has also been pointed out.
Cloud computing is a style of computing in which massively scalable IT-enabled capabilities are delivered 'as a service' to multiple customers using Internet technologies. This makes clear that cloud computing is a business concern rather than a purely technical one. In it, information is stored on servers and provided as a service, on demand, to clients. Cloud computing is about moving services, computation and/or data off-site, for cost and business advantage, to an internal or external, location-transparent, centralized facility or contractor. By making data available in the cloud, it can be more easily accessed, often at much lower cost, increasing its value by enabling opportunities for enhanced collaboration, integration, and analysis on a shared common platform. In this paper we study the three service areas of cloud computing: SaaS, PaaS and IaaS. The paper also compares these services on different issues, e.g. computing power, storage and networking.
Construction projects involve multidisciplinary and multi-actor collaborations that generate massive amounts of data over their lifecycle. Data are often sensitive, and embody the rights, ownership and intellectual property of the creator. Managing project information raises concerns about security, inconsistency and loss of data. The conventional approach to dealing with the complexities of data management involves the adoption of BIM-based solutions that lack suitable means for the governance of collaboration, and for access to and archival of managed data. To overcome the limitations of BIM, Cloud-based governance solutions have been suggested as a way forward. However, there is a lack of understanding of construction ICT (Information and Communication Technology) practices from the perspectives of data management and governance. This paper aims to fill this gap; first, by exploring barriers related to BIM adoption and collaboration practices, in particular issues related to data management and governance that can potentially be ameliorated with Cloud technologies, and second, by identifying key requirements for Cloud-based BIM governance solutions. A structured questionnaire was administered to informed construction practitioners in this study. The findings reveal several barriers to BIM adoption alongside ICT and collaboration issues, with an urgent need to develop a BIM governance solution underpinned by cloud technology. Further, a number of important requirements for developing BIM governance solutions have been identified.
Cloud computing is a fast-emerging paradigm for computing that promises a greener, scalable, cost-effective alternative for utilizing IT resources. Cloud solutions facilitate faster implementation of innovative ideas, thereby leveraging businesses and changing the way we design and deliver software, as well as the business models used. This paradigm shifts processing and storage activities from the client side to the data centers. This paper gives an overview of the concept and its core technologies, along with its underlying characteristics, service layers and infrastructure models. The latter part discusses case studies of a sampled list of cloud services and providers, benefits and impact on customers, user concerns, mass acceptability, current research and future trends.
The advent of Cloud Computing has altered the IT landscape. Hardware, software and the technical expertise required to provide and maintain IT services have been aggregated in the data centre. This has given rise to a new IT delivery system resulting in significant benefits in the way of efficiency and cost saving, to name but a few. There are obvious positives which give cloud computing legitimacy as a solution, such as:
• Economy of scale: virtualization has dramatically increased hardware utilization, meaning IT workloads can be packed densely into a smaller space, so more IT can be serviced with less resource.
• A reduction in capital expenditure on infrastructure and lower software licensing costs.
• Paying for IT services on demand, giving greater cost efficiency and marrying expenditure on IT services to company cash flow (based on operational activity).
• Workforce distribution: if employees can access the organization's data and IT resources over the Internet, there is potential for savings by paying local (potentially cheaper) rates and lessening the overhead costs associated with having to provide dedicated office space in expensive locations.
• Outsourcing technical IT expertise to the datacenter, lessening the need to hire IT staff directly to manage specific IT needs that can be provided for by a cloud service provider.
Those are just some of the benefits which make cloud computing such an attractive model. However, there are also significant challenges that have emerged over time as cloud services have been adopted by a wide range of clients, from personal users to commercial organizations to government agencies. These are issues relating to data security, privacy, regulatory compliance, auditing and assessment, etc., all of which are brought on by the lessening of an organization's control over its IT resources when a move to the Cloud occurs. This is because of the distributed nature of Cloud services, in which an organization's data now physically resides off-premises, perhaps in a system whose boundaries are significantly blurred in comparison to a deployment on the more traditional client-server IT model, and where the expertise to manage that resource has been outsourced to a third party. One such challenge is in the area of legislation. This document aims to outline some of the jurisdictional issues that accompany the adoption of Cloud services by an organization or entity. We investigate the current legal status of cloud computing and identify what is being done to legislate for the changing face of IT as a result of the wide-scale uptake of Cloud-based services. In particular, how does a system of law, which is traditionally applied based on jurisdiction, apply to a public cloud solution that is essentially ethereal in its nature?
Cloud computing plays a critical role in modern society and enables a range of applications from infrastructure to social media. Such systems must cope with varying load and evolving usage, reflecting society's interaction with and dependency on automated computing systems, whilst satisfying Quality of Service (QoS) guarantees. Enabling these systems is a cohort of conceptual technologies, synthesized to meet the demands of evolving computing applications. In order to understand the current and future challenges of such systems, there is a need to identify the key technologies enabling future applications. In this study, we aim to explore how three emerging paradigms (Blockchain, IoT and Artificial Intelligence) will influence future cloud computing systems. Further, we identify several technologies driving these paradigms and invite international experts to discuss the current status and future directions of cloud computing. Finally, we propose a conceptual model for cloud futurology to explore the influence of emerging paradigms and technologies on the evolution of cloud computing.
- by Sukhpal Singh Gill and +1
- Software Engineering, Climate Change, Machine Learning, Security