Grid Computing in 3DEM Image Processing Using Xmipp

Large-scale biomedical image analysis in grid environments

2008

This paper presents the application of a component-based Grid middleware system for processing extremely large images obtained from digital microscopy devices. We have developed parallel, out-of-core techniques for different classes of data processing operations employed on images from confocal microscopy scanners. These techniques are combined into a data preprocessing and analysis pipeline using the component-based middleware system. The experimental results show that 1) our implementation achieves good performance and can handle very large datasets on high-performance Grid nodes consisting of computation and/or storage clusters, and 2) it can take advantage of Grid nodes connected over high-bandwidth wide-area networks by combining task and data parallelism.
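
As a rough illustration of the out-of-core, data-parallel pattern this abstract describes, the C++ sketch below streams a large raw image in fixed-size chunks, filters a batch of chunks on parallel threads, and flushes results to disk before reading the next batch, so memory use stays bounded. The file names, chunk size, and the per-pixel filter are all invented for the example; the paper's actual middleware components are not shown.

```cpp
// Illustrative sketch (not the paper's middleware): out-of-core, data-parallel
// filtering of an image too large to hold in RAM.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <fstream>
#include <thread>
#include <vector>

// Placeholder per-pixel operation; a real pipeline would run normalization,
// segmentation, registration, etc. here.
static void filter_chunk(std::vector<std::uint8_t>& chunk) {
    for (auto& px : chunk) px = static_cast<std::uint8_t>(255 - px);
}

int main() {
    const std::size_t chunkBytes = 64u * 1024u * 1024u;   // 64 MB per chunk
    const unsigned nthreads = std::max(1u, std::thread::hardware_concurrency());

    std::ifstream in("huge_image.raw", std::ios::binary);   // hypothetical input
    std::ofstream out("huge_image_out.raw", std::ios::binary);

    std::vector<std::vector<std::uint8_t>> batch;
    auto process_and_flush = [&] {
        std::vector<std::thread> pool;
        for (auto& c : batch) pool.emplace_back(filter_chunk, std::ref(c));
        for (auto& t : pool) t.join();
        for (auto& c : batch)                 // write results, free the memory
            out.write(reinterpret_cast<const char*>(c.data()),
                      static_cast<std::streamsize>(c.size()));
        batch.clear();
    };

    while (in) {
        std::vector<std::uint8_t> buf(chunkBytes);
        in.read(reinterpret_cast<char*>(buf.data()),
                static_cast<std::streamsize>(buf.size()));
        buf.resize(static_cast<std::size_t>(in.gcount()));
        if (buf.empty()) break;
        batch.push_back(std::move(buf));
        if (batch.size() == nthreads) process_and_flush();
    }
    process_and_flush();                      // handle the final partial batch
}
```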

Exploiting desktop supercomputing for three-dimensional electron microscopy reconstructions using ART with blobs

Journal of Structural Biology, 2009

Keywords: multi-threading, multi-core, three-dimensional electron microscopy, single particle electron microscopy, ART

Three-dimensional electron microscopy allows direct visualization of biological macromolecules close to their native state. The high impact of this technique in the structural biology field is closely tied to the development of new image processing algorithms. In order to achieve subnanometer resolution, the size and number of images involved in a three-dimensional reconstruction increase, and so do the computational requirements. New chips integrating multiple processors are hitting the market at a reduced cost. This high-integration, low-cost trend has just begun and is expected to bring real supercomputers to our laboratory desktops in the coming years. This paper proposes a parallel implementation of a computation-intensive algorithm for three-dimensional reconstruction, ART, that takes advantage of the computational power in modern multicore platforms. ART is a sophisticated iterative reconstruction algorithm that has turned out to be well suited for the conditions found in three-dimensional electron microscopy. In view of the performance obtained in this work, these modern platforms are expected to play an important role in facing the future challenges of three-dimensional electron microscopy.
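
For readers unfamiliar with ART (the Algebraic Reconstruction Technique), the sketch below shows one way a sweep can be parallelized on a multicore machine: each thread computes corrections x <- x + lambda*(b_i - a_i.x)/||a_i||^2 * a_i for its share of rays against the current volume, and all corrections are applied together, as in block-iterative variants of ART. This is an illustrative simplification, not the implementation described in the paper (which uses blob basis functions and its own decomposition).

```cpp
// Minimal sketch of one block-iterative ART sweep parallelized with threads.
#include <cstddef>
#include <thread>
#include <vector>

struct Ray {                      // one equation a_i . x = b_i
    std::vector<std::size_t> idx; // voxels touched by the ray
    std::vector<double> w;        // interpolation weights (e.g. blob footprints)
    double b;                     // measured projection value
};

void art_block_sweep(std::vector<double>& x, const std::vector<Ray>& rays,
                     double lambda, unsigned nthreads) {
    // One private correction buffer per thread avoids locking; for a sketch
    // the extra memory is acceptable.
    std::vector<std::vector<double>> corr(nthreads,
                                          std::vector<double>(x.size(), 0.0));
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < nthreads; ++t) {
        pool.emplace_back([&, t] {
            for (std::size_t i = t; i < rays.size(); i += nthreads) {
                const Ray& r = rays[i];
                double dot = 0.0, norm2 = 0.0;
                for (std::size_t k = 0; k < r.idx.size(); ++k) {
                    dot   += r.w[k] * x[r.idx[k]];
                    norm2 += r.w[k] * r.w[k];
                }
                if (norm2 == 0.0) continue;
                const double c = lambda * (r.b - dot) / norm2;
                for (std::size_t k = 0; k < r.idx.size(); ++k)
                    corr[t][r.idx[k]] += c * r.w[k];
            }
        });
    }
    for (auto& th : pool) th.join();
    for (const auto& part : corr)             // apply all corrections at once
        for (std::size_t v = 0; v < x.size(); ++v) x[v] += part[v];
}
```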

Grid computing in image analysis

Diagnostic pathology, 2011

Diagnostic surgical pathology or tissue-based diagnosis still remains the most reliable and specific diagnostic medical procedure. The development of whole slide scanners permits the creation of virtual slides and work with so-called virtual microscopes. In addition to interactive work on virtual slides, approaches have been reported that introduce automated virtual microscopy, which is composed of several tools focusing on quite different tasks. These include evaluation of image quality and image standardization, analysis of potentially useful thresholds for object detection and identification (segmentation), dynamic segmentation procedures, adjustable magnification to optimize feature extraction, and texture analysis including image transformation and evaluation of elementary primitives. Grid technology seems to possess all features to efficiently target and control the specific tasks of image information and detection in order to obtain a detailed and accurate diagnosis.
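
As a concrete example of the threshold-analysis step mentioned above, the sketch below implements Otsu's method, a standard way to select a segmentation threshold automatically by maximizing the between-class variance over a grayscale histogram. It is offered only as an illustration; the paper does not specify this particular method.

```cpp
// Otsu's method: pick the threshold that maximizes between-class variance
// of the background/foreground split over a 256-bin histogram.
#include <array>
#include <cstdint>
#include <vector>

int otsu_threshold(const std::vector<std::uint8_t>& pixels) {
    std::array<double, 256> hist{};
    for (std::uint8_t p : pixels) hist[p] += 1.0;

    const double total = static_cast<double>(pixels.size());
    double sumAll = 0.0;
    for (int i = 0; i < 256; ++i) sumAll += i * hist[i];

    double sumB = 0.0, wB = 0.0, best = -1.0;
    int threshold = 0;
    for (int t = 0; t < 256; ++t) {
        wB += hist[t];                        // weight of background class
        if (wB == 0.0) continue;
        const double wF = total - wB;         // weight of foreground class
        if (wF == 0.0) break;
        sumB += t * hist[t];
        const double mB = sumB / wB;            // background mean
        const double mF = (sumAll - sumB) / wF;  // foreground mean
        const double between = wB * wF * (mB - mF) * (mB - mF);
        if (between > best) { best = between; threshold = t; }
    }
    return threshold;  // pixels above the threshold are treated as objects
}
```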

Toward an Automated HPC Pipeline for Processing Large Scale Electron Microscopy Data

2020 IEEE/ACM 2nd Annual Workshop on Extreme-scale Experiment-in-the-Loop Computing (XLOOP)

We present a fully modular and scalable software pipeline for processing electron microscope (EM) images of brain slices into 3D visualizations of individual neurons, and demonstrate an end-to-end segmentation of a large EM volume using a supercomputer. Our pipeline scales multiple packages used by the EM community with minimal changes to the original source codes. We tested each step of the pipeline individually, on a workstation, a cluster, and a supercomputer. Furthermore, we can compose workflows from these operations using a Balsam database, trigger them during data acquisition or from different front ends, and control the granularity of the pipeline execution. We describe the implementation of our pipeline and the modifications required to integrate and scale up existing codes. The modular nature of our environment enables diverse research groups to contribute to the pipeline without disrupting the workflow, i.e., new codes can easily be integrated for each step of the pipeline.
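
Balsam itself is a Python-based workflow service, so the sketch below does not reproduce its API. It only illustrates the composition idea the abstract describes: each stage wraps an existing, unmodified community code behind a command line, and stages are chained through their input/output locations so a new code can be swapped in without touching the rest. All stage names and flags are hypothetical.

```cpp
// Generic pipeline-composition sketch (not Balsam's API): run external,
// unmodified tools as stages chained through their input/output directories.
#include <cstdlib>
#include <iostream>
#include <string>
#include <vector>

struct Stage {
    std::string name;
    std::string command;  // existing EM tool, wrapped rather than rewritten
};

int main() {
    // Hypothetical stage commands; real ones would be the community codes
    // (alignment, segmentation, meshing, ...) with their own CLIs.
    std::vector<Stage> pipeline = {
        {"align",   "align_tiles --in raw/     --out aligned/"},
        {"segment", "segment_vol --in aligned/ --out labels/"},
        {"mesh",    "make_meshes --in labels/  --out meshes/"},
    };
    for (const auto& s : pipeline) {
        std::cout << "[pipeline] running stage: " << s.name << '\n';
        if (int rc = std::system(s.command.c_str()); rc != 0) {
            std::cerr << "[pipeline] stage failed: " << s.name << '\n';
            return rc;  // stop so the failed stage can be re-run in isolation
        }
    }
}
```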

Parallel, distributed and GPU computing technologies in single-particle electron microscopy

Acta Crystallographica Section D-biological Crystallography, 2009

Most known methods for the determination of the structure of macromolecular complexes are limited, or at least restricted at some point, by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented, and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.
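
The sketch below illustrates how these paradigms layer in a typical single-particle workload: the particle stack is split across nodes (coarse-grained, indicated only by a comment), each node's share is split across threads, and the per-particle kernel is the part a GPU implementation would offload. The kernel shown is a trivial placeholder, not a real alignment routine.

```cpp
// Layered parallelism sketch: node level (commented), thread level (shown),
// and a per-particle kernel that GPU builds would move to the device.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <thread>
#include <vector>

struct Particle { std::vector<float> pixels; };

// Placeholder for the expensive per-particle kernel (e.g. multi-reference
// alignment); in a GPU implementation, this is the offloaded part.
float score_particle(const Particle& p) {
    float s = 0.f;
    for (float v : p.pixels) s += v * v;
    return std::sqrt(s);
}

int main() {
    std::vector<Particle> stack(
        10000, Particle{std::vector<float>(128 * 128, 1.f)});
    // (Node level, not shown: MPI or a grid scheduler would assign each node
    //  a contiguous slice of 'stack'.)
    const unsigned nthreads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<float> scores(stack.size());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < nthreads; ++t)
        pool.emplace_back([&, t] {
            for (std::size_t i = t; i < stack.size(); i += nthreads)
                scores[i] = score_particle(stack[i]);  // core level
        });
    for (auto& th : pool) th.join();
}
```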

XMIPP: a new generation of an open-source image processing package for electron microscopy

X-Window-based Microscopy Image Processing Package (Xmipp) is a specialized suite of image processing programs, primarily aimed at obtaining the 3D reconstruction of biological specimens from large sets of projection images acquired by transmission electron microscopy. This public-domain software package was introduced to the electron microscopy field eight years ago, and since then it has changed drastically. New methodologies for the analysis of single-particle projection images have been added, covering classification, contrast transfer function correction, angular assignment, 3D reconstruction, reconstruction of crystals, etc. In addition, the package has been extended with functionalities for 2D crystal and electron tomography data. Furthermore, its current implementation in C++, with a highly modular design of well-documented data structures and functions, offers a convenient environment for the development of novel algorithms. In this paper, we present a general overview of a new generation of Xmipp that has been re-engineered to maximize flexibility and modularity, potentially facilitating its integration in future standardization efforts in the field. Moreover, by focusing on those developments that distinguish Xmipp from other packages available, we illustrate its added value to the electron microscopy community.
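
To make the modularity claim concrete, here is a purely hypothetical sketch of the kind of C++ design the abstract describes: algorithms share a small common interface so new operations can be added without modifying existing code. The class and method names are invented for illustration and are not Xmipp's actual data structures.

```cpp
// Hypothetical modular design sketch (invented names, not Xmipp's API):
// new algorithms plug in by implementing a shared interface.
#include <iostream>
#include <memory>
#include <vector>

using Image = std::vector<double>;  // stand-in for a real image type

struct ImageOperation {
    virtual ~ImageOperation() = default;
    virtual void apply(Image& img) const = 0;
};

struct NormalizeOp : ImageOperation {
    void apply(Image& img) const override {
        double mean = 0.0;
        for (double v : img) mean += v;
        mean /= img.empty() ? 1.0 : static_cast<double>(img.size());
        for (double& v : img) v -= mean;  // zero-mean normalization
    }
};

int main() {
    // A processing chain is just a list of operations; adding a novel
    // algorithm means adding one class, nothing else changes.
    std::vector<std::unique_ptr<ImageOperation>> steps;
    steps.push_back(std::make_unique<NormalizeOp>());
    Image img(64 * 64, 3.0);
    for (const auto& op : steps) op->apply(img);
    std::cout << "processed " << img.size() << " pixels\n";
}
```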

Xmipp: An Image Processing Package for Electron Microscopy

Journal of Structural Biology, 1996

We present in this article a specialized suite of image processing programs primarily aimed at obtaining the three-dimensional reconstruction of biological specimens from large sets of projection images obtained by transmission electron microscopy. The programs are written in ANSI-C and use X-Windows for graphical output. A number of parallel extensions using PVM and PARMACS are provided. The programs are freely available by anonymous ftp at ftp.cnb.uam.es.

High performance computing environment for multidimensional image analysis

BMC Cell Biology, 2007

The processing of images acquired through microscopy is a challenging task due to the large size of the datasets (several gigabytes) and the fast turnaround times required. If the throughput of the image processing stage is significantly increased, it can have a major impact on microscopy applications.

SPIDER and WEB: Processing and Visualization of Images in 3D Electron Microscopy and Related Fields

Journal of Structural Biology, 1996

The SPIDER system has evolved into a comprehensive tool set for image processing, making use of modern graphics interfacing in the VMS and UNIX environment. SPIDER and WEB handle the complementary tasks of batch processing and visualization of the results. The emphasis of the SPIDER system remains in the area of single particle averaging and reconstruction, although a variety of other application areas have been added. Novel features are a suite of operations relating to the determination, modeling, and correction of the contrast transfer function and the availability of the entire documentation in hypertext format.
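
As background for the CTF-related operations mentioned above, the sketch below evaluates a common radially symmetric CTF model and applies phase flipping, the simplest CTF correction (negating Fourier components where the CTF is negative). Conventions for the defocus sign and the exact parametrization vary between packages, so treat this as one representative form rather than SPIDER's actual implementation.

```cpp
// One common CTF parametrization (conventions vary between packages):
//   chi(f) = pi*lambda*dz*f^2 - (pi/2)*Cs*lambda^3*f^4
//   CTF(f) = -sqrt(1 - A^2)*sin(chi) - A*cos(chi)
// Phase flipping multiplies each Fourier coefficient by the sign of the CTF.
#include <cmath>
#include <complex>
#include <cstddef>
#include <vector>

double ctf_value(double f /* spatial frequency, 1/Angstrom */,
                 double lambda /* electron wavelength, Angstrom */,
                 double dz /* defocus, Angstrom (underfocus > 0) */,
                 double Cs /* spherical aberration, Angstrom */,
                 double A /* amplitude contrast fraction */) {
    const double pi = 3.14159265358979323846;
    const double chi = pi * lambda * dz * f * f
                     - 0.5 * pi * Cs * std::pow(lambda, 3) * std::pow(f, 4);
    return -std::sqrt(1.0 - A * A) * std::sin(chi) - A * std::cos(chi);
}

// Apply phase flipping to a 1D radial line of Fourier coefficients;
// a full 2D correction loops over all frequency pixels the same way.
void phase_flip(std::vector<std::complex<double>>& F, double freqStep,
                double lambda, double dz, double Cs, double A) {
    for (std::size_t i = 0; i < F.size(); ++i) {
        const double f = static_cast<double>(i) * freqStep;
        if (ctf_value(f, lambda, dz, Cs, A) < 0.0) F[i] = -F[i];
    }
}
```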