
Implementation of Image Processing System Using Big Data in the Cloud Environment

Cloud computing is one of the emerging techniques for processing big data and is also known as service on demand. Big data refers to very large volumes of data, and processing it (for example, MRI and DICOM images) normally takes considerable time. Such demanding tasks can be addressed with Hadoop, whose two core components help here: the Hadoop Distributed File System (HDFS) and MapReduce. HDFS is Hadoop's file storage system, used for storing and retrieving data. MapReduce combines two functions, map and reduce: map splits the input, and reduce integrates the outputs of the map phase. Medical experts have recently faced problems such as machine failure and fault tolerance while processing results for scanned data. To address this, a unique optimized time-scheduling algorithm, the Dynamic Handover Reduce Function (DHRF), is introduced in the reduce function. Enhancing Hadoop in the cloud and introducing DHRF helps overcome these processing risks and yields optimized results with less waiting time and a lower error percentage in the output image.
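
As a rough sketch of the split-and-integrate structure this abstract describes, the minimal Hadoop job below (Java, with a hypothetical one-slice-per-line record layout) shows where a scheduling policy such as DHRF would sit; the DHRF algorithm itself is not specified in the abstract, so only its insertion point in the reducer is indicated.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal MapReduce skeleton: map splits the input, reduce integrates map output.
public class ImageJobSketch {

  // Map: one record in, zero or more (key, value) pairs out.
  public static class SplitMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);

    @Override
    protected void map(LongWritable offset, Text line, Context ctx)
        throws IOException, InterruptedException {
      // Hypothetical record layout: "<imageId>\t<sliceId>" per scanned slice.
      String imageId = line.toString().split("\t")[0];
      ctx.write(new Text(imageId), ONE);
    }
  }

  // Reduce: integrate all values emitted for one key.
  // A scheduling policy such as DHRF would decide *when* each reduce
  // task is handed its data; the per-key logic stays the same.
  public static class IntegrateReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text imageId, Iterable<IntWritable> counts, Context ctx)
        throws IOException, InterruptedException {
      int total = 0;
      for (IntWritable c : counts) total += c.get();
      ctx.write(imageId, new IntWritable(total));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "image-slice-count");
    job.setJarByClass(ImageJobSketch.class);
    job.setMapperClass(SplitMapper.class);
    job.setReducerClass(IntegrateReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```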

Hadoop and Its Role in Modern Image Processing

Open Journal of Marine Science, 2014

This paper introduces MapReduce as a distributed data processing model that uses the open source Hadoop framework to manipulate large volumes of data. The huge volume of data in the modern world, particularly multimedia data, creates new requirements for processing and storage. As an open source distributed computational framework, Hadoop provides the infrastructure to process large collections of images across an arbitrarily large set of computing nodes. This paper introduces the framework, surveys current work, and discusses its advantages and disadvantages.

A Review on Hadoop MapReduce using image processing and cloud computing

2017

Hadoop is an open source framework that allows distributed processing of large data sets across clusters of computers. Big data describes the technology used to capture, store, distribute, and manage very large data sets. Data is generated by many different sources, such as web sites and click streams (e.g., Netflix, Facebook, Google), sensors (energy monitoring, application monitoring, telescopes), and biomedical diagnosis. Image processing performs an important function in research areas such as biomedical imaging, remote sensing, astronomy, and the internet. The Hadoop image processing library is used with the Apache Hadoop MapReduce programming framework and provides high-throughput image processing. Image segmentation splits a digital image into multiple segments and is used to identify objects and their boundaries. Images are stored in the Hadoop Distributed File System, after which a MapReduce algorithm is applied to extract...
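
As a small illustration of that last step, the sketch below stages a local image collection into HDFS using the standard Hadoop FileSystem API, so that a MapReduce job can then be run over it; the source and destination paths are hypothetical placeholders.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch: copy a local image collection into HDFS before running a
// MapReduce job over it. Paths are hypothetical placeholders.
public class LoadImagesToHdfs {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // fs.defaultFS is normally picked up from core-site.xml on the cluster.
    FileSystem fs = FileSystem.get(conf);

    Path local = new Path("file:///data/images");   // local source directory
    Path remote = new Path("/user/hadoop/images");  // HDFS destination

    fs.mkdirs(remote);
    fs.copyFromLocalFile(false /* keep source */, true /* overwrite */, local, remote);
    System.out.println("Images staged at " + remote);
    fs.close();
  }
}
```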

Time Efficient Image Processing Framework using MapReduce (Sanketh T.)

2017

This paper introduces an effective processing framework named Image Cloud Processing (ICP) to cope with the data explosion in the image processing field. While most previous research focuses on optimizing image processing algorithms for higher efficiency, this work provides a general framework within which such algorithms can be implemented in parallel, achieving a boost in time efficiency without compromising result quality as image scale increases. Two novel data representations, P-Image and Big-Image, are designed to cooperate with MapReduce and achieve a more optimized configuration and higher efficiency. The framework is implemented through a parallel processing procedure that works with the traditional processing mechanism of the distributed system. Representative results of comprehensive experiments on the challenging ImageNet dataset are presented to validate the capacity of the proposed ICP framework...
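
The abstract does not define the internals of P-Image or Big-Image; as a hedged illustration of what a MapReduce-friendly image record generally looks like, here is a minimal custom Hadoop Writable holding fixed image metadata plus a raw pixel buffer, serialized compactly between map and reduce.

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Writable;

// Generic stand-in for a MapReduce-friendly image record: metadata
// plus a raw pixel buffer, written and read in a fixed binary layout.
public class ImageRecord implements Writable {
  private int width;
  private int height;
  private byte[] pixels = new byte[0]; // e.g. grayscale bytes

  public ImageRecord() {} // no-arg constructor required by Hadoop

  public ImageRecord(int width, int height, byte[] pixels) {
    this.width = width;
    this.height = height;
    this.pixels = pixels;
  }

  @Override
  public void write(DataOutput out) throws IOException {
    out.writeInt(width);
    out.writeInt(height);
    out.writeInt(pixels.length);
    out.write(pixels);
  }

  @Override
  public void readFields(DataInput in) throws IOException {
    width = in.readInt();
    height = in.readInt();
    pixels = new byte[in.readInt()];
    in.readFully(pixels);
  }
}
```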

Image Processing in Hadoop Distributed Environment

Satellite images are a growing source of information and have many applications. In this research, multispectral satellite images were subjected to unsupervised classification based on K-Means clustering using the Hadoop framework, which is designed for big data processing, along with the Hadoop Image Processing Interface (HIPI). We developed support for the GeoTIFF format, which is commonly used for satellite images, and we show that our methodology improves performance.
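
K-Means maps naturally onto MapReduce: in each iteration, mappers assign pixels to their nearest centroid and reducers average each cluster's members into new centroids. Below is a sketch of the assignment step only, with centroids broadcast through the job configuration; the one-"r,g,b"-pixel-per-line record layout is a hypothetical simplification of what HIPI would actually deliver.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// K-Means assignment step: emit (nearestCentroidId, pixelValue) so the
// reducer can average each cluster's members into a new centroid.
public class KMeansAssignMapper
    extends Mapper<LongWritable, Text, IntWritable, Text> {

  private double[][] centroids; // k centroids, 3 channels each

  @Override
  protected void setup(Context ctx) {
    // Centroids are passed in as "r,g,b;r,g,b;..." via the configuration.
    String[] parts = ctx.getConfiguration().get("kmeans.centroids").split(";");
    centroids = new double[parts.length][];
    for (int i = 0; i < parts.length; i++) {
      String[] ch = parts[i].split(",");
      centroids[i] = new double[] {
        Double.parseDouble(ch[0]), Double.parseDouble(ch[1]), Double.parseDouble(ch[2])
      };
    }
  }

  @Override
  protected void map(LongWritable key, Text value, Context ctx)
      throws IOException, InterruptedException {
    String[] ch = value.toString().split(",");
    double r = Double.parseDouble(ch[0]);
    double g = Double.parseDouble(ch[1]);
    double b = Double.parseDouble(ch[2]);

    int best = 0;
    double bestDist = Double.MAX_VALUE;
    for (int i = 0; i < centroids.length; i++) {
      double dr = r - centroids[i][0], dg = g - centroids[i][1], db = b - centroids[i][2];
      double d = dr * dr + dg * dg + db * db; // squared Euclidean distance
      if (d < bestDist) { bestDist = d; best = i; }
    }
    ctx.write(new IntWritable(best), value);
  }
}
```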

Processing Large Amounts of Images on Hadoop with OpenCV

2015

Modern image collections cannot be processed efficiently on one computer because of large collection sizes and the high computational cost of modern image processing algorithms. Hence, image processing often requires distributed computing. However, distributed computing is a complicated subject that demands deep technical knowledge and often cannot be used by researchers who develop image processing algorithms. A framework is needed that allows researchers to concentrate on image processing tasks and hides the complicated details of distributed computing, while also providing familiar image processing tools. This paper describes an extension to the MapReduce Image Processing (MIPr) framework that makes it possible to use OpenCV in a Hadoop cluster for distributed image processing. The modified MIPr framework allows the development of image processing programs in Java using the OpenCV Java binding. The performance testing of create...
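
MIPr's own integration details are not given in this abstract, but a mapper that calls the OpenCV Java binding inside Hadoop typically looks like the sketch below; the whole-file (file name, encoded bytes) input records and the per-image mean statistic are assumptions for illustration, and the OpenCV native library must be shipped to every node.

```java
import java.io.IOException;

import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfByte;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;

// Mapper using the OpenCV Java binding inside Hadoop. Assumes an input
// format that delivers (file name, whole encoded image bytes) records.
public class GrayscaleMapper
    extends Mapper<Text, BytesWritable, Text, IntWritable> {

  static {
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // load native OpenCV once per JVM
  }

  @Override
  protected void map(Text fileName, BytesWritable data, Context ctx)
      throws IOException, InterruptedException {
    // Decode the raw bytes into an OpenCV matrix.
    Mat encoded = new MatOfByte(data.copyBytes());
    Mat image = Imgcodecs.imdecode(encoded, Imgcodecs.IMREAD_COLOR);
    if (image.empty()) return; // skip unreadable files

    Mat gray = new Mat();
    Imgproc.cvtColor(image, gray, Imgproc.COLOR_BGR2GRAY);

    // Emit a trivial per-image statistic; a real job would write the
    // processed image back to HDFS instead.
    ctx.write(fileName, new IntWritable((int) Core.mean(gray).val[0]));
  }
}
```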

Multimedia processing using deep learning technologies, high‐performance computing cloud resources, and Big Data volumes

Concurrency and Computation: Practice and Experience, 2020

Summary: The last few years have been marked by the presence of very large sets of images and videos in our everyday lives. These multimedia objects are created and shared at a very fast rate, since images and videos can come from different devices such as smartphones, satellites, cameras, or drones. They are generally used to illustrate objects in different situations (public areas, train stations, hospitals, political and sporting events and competitions, etc.). As a consequence, image and video processing algorithms have gained increasing importance for several computer vision applications and should be adapted to manage large-scale volumes and exploit high-performance computing resources (local or cloud). In this work, we propose a cloud-based toolbox (platform) for computer vision applications. This platform integrates a toolbox of image and video processing algorithms that can (i) exploit high-performance computing cloud resources, (ii) execute applications in real time, an...

A Big Image Data Distributed Processing Frame Work in Static and Dynamic Image Cloud Processing

2018

This paper suggests a functional processing framework named Image Cloud Processing (ICP) to cope with large-scale data in the image processing field. Rather than optimizing individual image processing algorithms for greater proficiency, it focuses on providing a general structure within which algorithms can be executed in parallel. The ICP framework consists of two processing systems: Static ICP (SICP) and Dynamic ICP (DICP). SICP handles image information pre-stored in the distributed system, while DICP is initiated for dynamic information. To support SICP, two new data representations, P-Image and Big-Image, are implemented to coordinate with MapReduce and attain more optimized marshalling and higher coherence. DICP is enacted through a parallel processing schedule that works with the standard mechanism of the underlying framework. The ImageNet dataset is used to validate the capacity of the ICP framework over traditional state-of-the-art methods, both in time efficiency and qu...

Cloud Hadoop Map Reduce For Remote Sensing Image Analysis

Image processing algorithms related to remote sensing have been tested and utilized on the Hadoop MapReduce parallel platform, using an experimental 112-core high-performance cloud computing system situated in the Environmental Studies Center at the University of Qatar. Although there has been considerable research on using the Hadoop platform for image processing rather than its original purpose of text processing, it had never been demonstrated that Hadoop can successfully handle high-volume image files. Hence, the successful use of Hadoop for image processing was investigated using eight different practical image processing algorithms. We extend Hadoop's file handling to treat a whole TIFF image file as a single unit by extending the file formats Hadoop supports, and we then apply the same approach to other image formats such as JPEG, BMP, and GIF. Experiments show that the method is scalable and efficient in processing multiple large images used mostly for remote sensing applications, and the difference between single-PC runtime and Hadoop runtime is clearly noticeable.
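
Treating a whole image file as one record corresponds to the standard Hadoop pattern of a non-splittable FileInputFormat paired with a whole-file RecordReader; the sketch below is a generic version of that pattern, not the paper's actual code.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

// Input format that treats each image file (TIFF, JPEG, BMP, GIF, ...)
// as one indivisible record, in the spirit of the whole-file approach.
public class WholeImageInputFormat
    extends FileInputFormat<Text, BytesWritable> {

  @Override
  protected boolean isSplitable(JobContext context, Path file) {
    return false; // never split an image across mappers
  }

  @Override
  public RecordReader<Text, BytesWritable> createRecordReader(
      InputSplit split, TaskAttemptContext context) {
    return new WholeImageRecordReader();
  }

  static class WholeImageRecordReader extends RecordReader<Text, BytesWritable> {
    private FileSplit split;
    private Configuration conf;
    private final Text key = new Text();
    private final BytesWritable value = new BytesWritable();
    private boolean done = false;

    @Override
    public void initialize(InputSplit split, TaskAttemptContext context) {
      this.split = (FileSplit) split;
      this.conf = context.getConfiguration();
    }

    @Override
    public boolean nextKeyValue() throws IOException {
      if (done) return false;
      Path file = split.getPath();
      byte[] contents = new byte[(int) split.getLength()];
      FileSystem fs = file.getFileSystem(conf);
      try (FSDataInputStream in = fs.open(file)) {
        IOUtils.readFully(in, contents, 0, contents.length);
      }
      key.set(file.getName());
      value.set(contents, 0, contents.length);
      done = true;
      return true;
    }

    @Override public Text getCurrentKey() { return key; }
    @Override public BytesWritable getCurrentValue() { return value; }
    @Override public float getProgress() { return done ? 1.0f : 0.0f; }
    @Override public void close() {}
  }
}
```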