Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue

Christopher S Bjornsson et al. J Neurosci Methods. 2008.

Abstract

Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic 'divide and conquer' methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (approximately 100 μm) slices of rat brain tissue were labeled using three to five fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81 to 92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system.
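The FARSIGHT output described above is an XML file of per-cell measurements intended for downstream analysis. As a rough illustration of how such a file might be consumed, the sketch below parses it into Python records; the `Cell` and `Feature` element names and attributes are hypothetical placeholders, not the actual FARSIGHT schema.

```python
# Minimal sketch: loading a FARSIGHT-style per-cell XML output into Python records.
# The element/attribute names ("Cell", "id", "class", "Feature") are assumptions;
# the real schema is defined by the FARSIGHT output file.
import xml.etree.ElementTree as ET

def load_cells(xml_path):
    """Return a list of dicts, one per cell, with its class label and measurements."""
    tree = ET.parse(xml_path)
    cells = []
    for cell in tree.getroot().iter("Cell"):          # hypothetical element name
        record = {
            "id": cell.get("id"),
            "class": cell.get("class"),               # e.g. neuron, astrocyte, ...
            "features": {f.get("name"): float(f.get("value"))
                         for f in cell.findall("Feature")},
        }
        cells.append(record)
    return cells

# Example use for a simple exploratory query:
# from collections import Counter
# counts = Counter(c["class"] for c in load_cells("farsight_output.xml"))
```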

Figures

Figure 1

Figure 1. Maximum intensity projection of a 5-channel 3D image collected from a 100 μm-thick section of rat hippocampus

The microscope collects a 32-point emission spectrum at each voxel. After linear unmixing, separate channels were obtained computationally for (a) CyQuant-labeled cell nuclei (cyan; the small arrowhead points to a pyramidal cell nucleus, the large arrowhead to an endothelial cell nucleus), (b) NeuroTrace-labeled Nissl substance (purple), (c) Iba1-labeled microglia (yellow), (d) GFAP-labeled astrocytes (red), and (e) EBA-labeled blood vessels (green). (f) Combined projection illustrating a rich data set describing the position and morphology of the various nuclei, cell types, and vasculature. (g) Excitation spectra. The vertical lines indicate the laser lines used for excitation (stars). (h) Emission spectra. Careful optimization was required to achieve the full dynamic range for each of the five fluorophores. Linear unmixing was able to separate each fluorophore despite considerable overlap in emission spectra. Scale bar = 50 μm.
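As an illustration of the linear unmixing step described in this caption, the sketch below recovers per-voxel abundances of the five fluorophores from the 32-point emission spectra by non-negative least squares against known reference spectra. This is one common formulation, not necessarily the algorithm used by the Zeiss acquisition software.

```python
# Sketch of per-voxel linear spectral unmixing, assuming a measured 32-point emission
# spectrum per voxel and known reference emission spectra for the five fluorophores.
import numpy as np
from scipy.optimize import nnls

def unmix(spectral_stack, reference_spectra):
    """
    spectral_stack:    (Z, Y, X, 32) array of detected emission spectra.
    reference_spectra: (32, 5) matrix; column k is the emission spectrum of fluorophore k.
    Returns a (Z, Y, X, 5) array of estimated fluorophore abundances.
    """
    z, y, x, n_bands = spectral_stack.shape
    pixels = spectral_stack.reshape(-1, n_bands)
    abundances = np.empty((pixels.shape[0], reference_spectra.shape[1]))
    for i, spectrum in enumerate(pixels):
        # Solve reference_spectra @ c ≈ spectrum with c >= 0 (non-negative least squares).
        abundances[i], _ = nnls(reference_spectra, spectrum)
    return abundances.reshape(z, y, x, -1)
```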

Figure 2

Figure 2. Flowchart summary of the main image processing steps

The Zeiss LSM META microscope acquires a 32-point emission spectrum at each voxel. These data are spectrally unmixed to yield a 5-channel 3D image. Separate algorithms are used to segment nuclei and vessel surfaces, and to trace the processes of astrocytes and microglia. Intrinsic and associative measurements are computed after the segmentations are validated. The initial unsupervised classification produces a training set that is inspected and edited to compute an SVM classifier, which generates the final classifications. The final results are saved as an extensible markup language (XML) file for further analysis.
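The final supervised classification step described above could look roughly like the following sketch: an SVM trained on the inspected and edited training set of per-cell feature vectors (23 intrinsic + 8 associative measurements), then applied to every segmented nucleus. The feature scaling and kernel choice here are assumptions, not the paper's exact settings.

```python
# Sketch of the SVM classification stage, assuming per-cell feature vectors of length 31
# (23 intrinsic + 8 associative measurements) and class labels from the edited training set.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_and_classify(train_features, train_labels, all_features):
    """
    train_features: (n_train, 31) array from the inspected/edited training set.
    train_labels:   length-n_train array of class names (neuron, astrocyte, microglia, endothelial).
    all_features:   (n_cells, 31) array for every segmented nucleus.
    Returns the predicted class name for every cell.
    """
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))   # kernel choice is an assumption
    clf.fit(np.asarray(train_features), np.asarray(train_labels))
    return clf.predict(np.asarray(all_features))
```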

Figure 3

Figure 3. Automated segmentation results for individual channels, and computation of some associative features

All images are 2D projections of 3D datasets. The field is ~195 μm wide for the lower-magnification representations (a, b, c, and e). (a) Nuclear segmentation from the CyQuant channel. Each segmented (but unclassified) nucleus is assigned a unique identification number. (b) Automated tracing of microglial processes and segmentation of cell bodies from the Iba1 channel, shown in yellow over the image data, which is shown in grayscale. (c) Automated segmentation of the vasculature from the EBA channel. (d) Enlarged view of a representative segment (red box in c). The triangulated 3D mesh representation of the vessel surface can be seen. (e) Automated traces of astrocyte processes from the GFAP channel, displayed in red over a grayscale representation of the image data. (f) Enlarged view of a representative region (blue box in e). The calculated convergence point of the GFAP cytoskeleton for one cell is indicated with a green cross. (g) Illustrates the zone around a segmented nucleus within which the Iba1 signal is summed to identify microglial nuclei. (h) Illustrates the zone surrounding a different segmented nucleus used to identify NeuroTrace-positive nuclei. (i) Illustrates the computation of a segmented nucleus's proximity to a vessel segment for identifying endothelial cells. (j) Illustrates the proximity of the convergence point of astrocyte processes (green cross) to a segmented nucleus for identifying astrocytes.
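Two of the associative features illustrated in panels (g) and (j) can be sketched as follows: summing a marker channel inside a dilated zone around a segmented nucleus, and measuring the distance from a nucleus centroid to the convergence point of astrocyte processes. The zone radius, the voxel-mask representation, and the function names are simplifying assumptions, not the paper's implementation.

```python
# Sketch of two associative measurements, assuming binary nucleus masks and voxel coordinates.
import numpy as np
from scipy import ndimage

def signal_in_zone(channel, nucleus_mask, radius_vox=3):
    """Sum a marker channel (e.g. Iba1) inside a dilated zone around one nucleus mask."""
    zone = ndimage.binary_dilation(nucleus_mask, iterations=radius_vox)  # radius is an assumption
    return float(channel[zone].sum())

def convergence_point_distance(nucleus_centroid, convergence_point, spacing=(1.0, 1.0, 1.0)):
    """Euclidean distance (physical units) between a GFAP convergence point and a nucleus centroid."""
    delta = (np.asarray(nucleus_centroid) - np.asarray(convergence_point)) * np.asarray(spacing)
    return float(np.linalg.norm(delta))
```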

Figure 4

Figure 4. Illustrating the accuracy of cell classification (upper row), and showing a 3D rendering of all segmentation and classification results

(a) Automated classification results for all segmented nuclei (neurons are magenta, astrocytes red, microglia yellow, and endothelial cells green). (b) Manual classification resulting from the nucleus-by-nucleus validation of the entire dataset. (c) Difference map illustrating the specific nuclei that were misclassified relative to the validation results. The misclassified objects were colored according to the investigator-indicated results. (d) Composite 3D rendering of validated cell segmentation and classification results, together with astrocyte processes (red lines), microglial processes (yellow lines), and vessel segments (aqua surfaces). This rendering demonstrates the density of information extracted by FARSIGHT and enables qualitative assessment of cell distributions and their relationships with each other.
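The comparison underlying panels (a)-(c), automated versus validated manual classification, amounts to a per-class accuracy computation plus a list of misclassified nuclei. A minimal sketch follows, assuming each classification is available as a mapping from nucleus ID to class name.

```python
# Sketch of comparing automated labels against validated manual labels,
# yielding per-class accuracy and the misclassified nuclei shown in a difference map.
from collections import defaultdict

def compare_classifications(auto_labels, manual_labels):
    """auto_labels and manual_labels map nucleus id -> class name (assumed input format)."""
    per_class = defaultdict(lambda: {"correct": 0, "total": 0})
    misclassified = []
    for nucleus_id, truth in manual_labels.items():
        per_class[truth]["total"] += 1
        if auto_labels.get(nucleus_id) == truth:
            per_class[truth]["correct"] += 1
        else:
            misclassified.append((nucleus_id, auto_labels.get(nucleus_id), truth))
    accuracy = {c: v["correct"] / v["total"] for c, v in per_class.items() if v["total"]}
    return accuracy, misclassified
```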

Figure 5

Figure 5. Associative analysis of FARSIGHT-generated cell segmentation and classification results

(a) A graph describing a six-nearest-neighbor analysis of neurons in the dataset. The lines represent the neuron-to-neuron associations. Differences in line length are partly due to foreshortening that occurs when the 3D maps are projected for a 2D illustration. (b) Frequency histogram of the average nearest-neighbor distances showing two populations. The major population includes neurons in the denser CA1 pyramidal cell layer in the lower right. The minor population includes neurons in the extra-pyramidal cell region. (c) The neurons in the extra-pyramidal region are displayed in dark purple to distinguish them from the neurons in the pyramidal cell layer. (d) Map of distances from vascular elements. The mapped distance at each voxel was the shorter of the distances to a segmented EBA-labeled vascular element or a validated vascular nucleus. Each voxel's distance to the nearest vascular element is indicated in grayscale, with black closest to a vascular element and white at the greatest distance. The apparent differences in the distances depicted in the figure are the result of foreshortening due to the 2D projection of the 3D map. (e and f) Frequency histograms describing cell distributions in the pyramidal cell layer and extra-pyramidal region. Data were plotted as the distance from each cell to the nearest vascular unit. Measurements from the vascular distance map (d) were used to determine the shortest distance from each neuron, astrocyte, and microglial cell to the nearest vascular element. The volumes of the dataset containing the pyramidal cell layer and extra-pyramidal region were estimated using the analysis of neuron distributions (c). These data indicate that, overall, cells in the pyramidal cell layer are closer to vascular elements (mode ~30 μm) than cells in the extra-pyramidal region (mode ~45 μm). (Note: the data used for this analysis were selected to eliminate edge-sampling errors.) See Table 2 for a more complete analysis.
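The two analyses in this figure, the six-nearest-neighbor graph of panel (a) and the vascular distance map of panels (d)-(f), can be sketched with standard scientific-Python tools as follows; the paper's exact implementation may differ.

```python
# Sketch, assuming neuron centroids as an (n, 3) array of physical coordinates and the
# vasculature as a binary voxel mask with known voxel spacing.
import numpy as np
from scipy.spatial import cKDTree
from scipy.ndimage import distance_transform_edt

def knn_graph(points, k=6):
    """Return edges (i, j) linking each point to its k nearest neighbors."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)           # first neighbor of each point is itself
    return [(i, int(j)) for i, row in enumerate(idx) for j in row[1:]]

def distances_to_vasculature(cell_centroids_vox, vessel_mask, spacing):
    """Distance from each cell centroid to the nearest vascular voxel (physical units)."""
    # Euclidean distance transform of the non-vessel background gives, at every voxel,
    # the distance to the closest vessel voxel; cells are then looked up at their centroids.
    dist_map = distance_transform_edt(~vessel_mask.astype(bool), sampling=spacing)
    coords = np.round(np.asarray(cell_centroids_vox)).astype(int)
    return dist_map[coords[:, 0], coords[:, 1], coords[:, 2]]
```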

Figure 6

Figure 6. Graphical interface for inspection and editing of segmentation & classification results

(a) Screen view of the user interface illustrating through-focus projections in xy (main panel, with segmented nuclei shown as white outlines), yz (immediately below), and xz (immediately to the right), drop-down menus showing choices for how objects can be visited and the validation type (lowermost left panel), and measurements for a selected object (lowermost right panel). (b) Screen capture of the validation window illustrating one selected nucleus from the main panel (yellow box and arrow). The slider on the immediate right of the slice image indicates the optical section being viewed. The slider below the image indicates relative magnification. The buttons below the image allow editing of segmentation and classification. Each object can be viewed alone or in relation to other objects through every optical section of the 3D image stack. (c) Illustrates observed segmentation errors, including (i) hyper-segmentation, (ii) under-segmentation, and (iii) one object encroaching on another. (d) Tabular summary of nuclear segmentation errors for each cell type in 3 different datasets. (e–g) Projections of the 3 datasets that were validated. Boxes indicate regions that were randomly selected for intensive validation.
