Arshad Khan - Academia.edu
Papers by Arshad Khan
Genome Research, 2002
To facilitate high-throughput 3D imaging of brain gene expression, a new method called voxelation has been developed. Spatially registered voxels (cubes) are analyzed, resulting in multiple volumetric maps of gene expression analogous to the images reconstructed in biomedical imaging systems. Using microarrays, 40 voxel images for 9000 genes were acquired from brains of both normal mice and mice in which a pharmacological model of Parkinson's disease (PD) had been induced by methamphetamine. Quality-control analyses established the reproducibility of the voxelation procedure. The investigation revealed a common network of coregulated genes shared between the normal and PD brain, and allowed identification of putative control regions responsible for these networks. In addition, genes involved in cell/cell interactions were found to be prominently regulated in the PD brains. Finally, singular value decomposition (SVD), a mathematical method used to provide parsimonious explanations of complex data sets, identified gene vectors and their corresponding images that distinguished between normal and PD brain structures, most pertinently the striatum.
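The SVD step can be pictured with a minimal sketch: treating the data as a voxel-by-gene matrix, each right-singular vector is a "gene vector" and the corresponding left-singular vector, scaled by its singular value, is that component's spatial image across the 40 voxels. The data shapes and the centering step below are assumptions for illustration, not the paper's actual pipeline.

```python
# Minimal sketch (assumed preprocessing): decompose a voxel-by-gene matrix
# with SVD; right-singular vectors act as gene vectors, left-singular vectors
# as the matching spatial expression images.
import numpy as np

n_voxels, n_genes = 40, 9000               # 40 voxels, ~9000 genes, as in the study
rng = np.random.default_rng(0)
X = rng.normal(size=(n_voxels, n_genes))   # placeholder for measured expression values

# Center each gene across voxels before decomposition (an assumed step)
Xc = X - X.mean(axis=0, keepdims=True)

# Thin SVD: Xc = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 0                                      # inspect the dominant component
gene_vector = Vt[k]                        # weight of every gene on component k
spatial_image = U[:, k] * s[k]             # component k's pattern across the 40 voxels
variance_explained = s[k] ** 2 / np.sum(s ** 2)
print(f"component {k}: {variance_explained:.1%} of variance")
```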
Journal of Proteome Research, 2006
Given the growing interest in applying genomic and proteomic approaches for studying the mammalian brain using mouse models, we present a global proteomic approach for analyzing brain tissue and, for the first time, a comprehensive characterization of the whole mouse brain proteome. Preparation of the whole brain sample incorporated a highly efficient cysteinyl-peptide enrichment (CPE) technique to complement a global enzymatic digestion method. Both the global and the cysteinyl-enriched peptide samples were analyzed by SCX fractionation coupled with reversed-phase LC-MS/MS analysis. A total of 48,328 different peptides were confidently identified (>98% confidence level), covering 7792 non-redundant proteins (∼34% of the predicted mouse proteome). Of these, 1564 and 1859 proteins were identified exclusively from the cysteinyl-peptide and the global peptide samples, respectively, corresponding to 25% and 31% improvements in proteome coverage compared to analysis of only the global peptide or cysteinyl-peptide samples. The identified proteins provide a broad representation of the mouse proteome with little bias evident due to protein pI, molecular weight, or cellular localization. Approximately 26% of the identified proteins with gene ontology (GO) annotations were membrane proteins, with 1447 proteins predicted to have transmembrane domains, and many of the membrane proteins were found to be involved in transport and cell signaling. The MS/MS spectrum count information for the identified proteins was used to provide a measure of relative protein abundances. The mouse brain peptide/protein database generated from this study represents the most comprehensive proteome coverage for the mammalian brain to date and provides the basis for future quantitative brain proteomic studies using mouse models. The proteomic approach presented here may have broad applications for rapid proteomic analyses of various mouse models of human brain diseases.
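The quoted 25% and 31% coverage gains follow directly from the reported counts. A quick check, assuming the "exclusive" identifications are the only difference between the single-sample and combined analyses:

```python
# Back-of-the-envelope check of the coverage figures quoted above.
total = 7792                  # non-redundant proteins from the combined analysis
only_cys = 1564               # identified only in the cysteinyl-peptide sample
only_global = 1859            # identified only in the global peptide sample

global_alone = total - only_cys        # what a global-only analysis would cover
cys_alone = total - only_global        # what a cysteinyl-only analysis would cover

print(f"gain from adding cysteinyl sample: {only_cys / global_alone:.0%}")   # ~25%
print(f"gain from adding global sample:    {only_global / cys_alone:.0%}")   # ~31%
```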
Genomics, 2003
We describe a microarray design based on the concept of error-correcting codes from digital communication theory. Currently, microarrays are unable to efficiently deal with "drop-outs," when one or more spots on the array are corrupted. The resulting information loss may lead to decoding errors in which no quantitation of expression can be extracted for the corresponding genes. This issue is expected to become increasingly problematic as the number of spots on microarrays expands to accommodate the entire genome. The error-correcting approach employs multiplexing (encoding) of more than one gene onto each spot to efficiently provide robustness to drop-outs in the array. Decoding then allows fault-tolerant recovery of the expression information from individual genes. The error-correcting method is general and may have important implications for future array designs in research and diagnostics.
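The abstract does not give the specific code construction, but the multiplexing-plus-decoding idea can be sketched with a toy pooled design: each spot measures a sum of several genes, and the built-in redundancy lets individual gene levels be recovered by least squares even after a spot drops out. The design matrix and expression values below are hypothetical.

```python
# Illustrative sketch only: a redundant pooling (design) matrix stands in for
# the error-correcting code. Each row is one spot; a 1 means that gene is
# multiplexed onto that spot.
import numpy as np

A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)   # 6 spots encoding 4 genes

true_expr = np.array([3.0, 1.0, 4.0, 2.0])  # hypothetical gene expression levels
y = A @ true_expr                            # spot intensities under the pooled design

# Simulate a "drop-out": spot 2 is corrupted, so discard it before decoding.
keep = [0, 1, 3, 4, 5]
recovered, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
print(recovered)   # ≈ [3. 1. 4. 2.] -- genes recovered despite the lost spot
```

A true error-correcting design would choose the pooling pattern from a code with known minimum distance, which determines how many drop-outs can be tolerated with guaranteed recovery.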
Journal of Proteome Research, 2005
Challenges associated with the efficient and effective preparation of micro- and nano-scale (micro- and nanogram) clinical specimens for proteomic applications include the unmitigated sample losses that occur during the processing steps. Herein we describe a simple "single-tube" preparation protocol appropriate for small proteomic samples using the organic co-solvent trifluoroethanol (TFE), which circumvents the loss of sample by facilitating both protein extraction and protein denaturation without requiring a separate cleanup step. The performance of the TFE-based method was initially evaluated by comparisons to traditional detergent-based methods on relatively large-scale sample processing using human breast cancer cells and mouse brain tissue. The results demonstrated that the TFE-based protocol provided comparable results to the traditional detergent-based protocols for larger, conventionally sized proteomic samples (>100 μg protein content), based on both sample recovery and peptide/protein identifications. The effectiveness of this protocol for micro- and nano-scale sample processing was then evaluated for the extraction of proteins/peptides and shown effective for small mouse brain tissue samples (∼30 μg total protein content) and also for samples of ∼5000 MCF-7 human breast cancer cells (∼500 ng total protein content), where the detergent-based methods were ineffective due to losses during cleanup and transfer steps.
Genome Research, 2002
Voxelation is a new method for acquisition of three-dimensional (3D) gene expression patterns in the brain. It employs high-throughput analysis of spatially registered voxels (cubes) to produce multiple volumetric maps of gene expression analogous to the images reconstructed in biomedical imaging systems. Using microarrays, 24 voxel images of coronal hemisections at the level of the hippocampus of both the normal human brain and Alzheimer's disease brain were acquired for 2000 genes. The analysis revealed a common network of coregulated genes, and allowed identification of putative control regions. In addition, singular value decomposition (SVD), a mathematical method used to provide economical explanations of complex data sets, produced images that distinguished between brain structures, including cortex, caudate, and hippocampus. The results suggest that voxelation will be a useful approach for understanding how the genome constructs the brain.
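The "common network of coregulated genes" can be illustrated with a simple correlation screen over the voxel profiles. The abstract does not specify the clustering method actually used, so the threshold-on-correlation approach below is only an assumption for illustration, with placeholder data of the same shape as the study (24 voxels, 2000 genes).

```python
# Hedged sketch: flag gene pairs whose spatial expression profiles across the
# voxels track each other closely, as candidate members of a coregulated network.
import numpy as np

n_voxels, n_genes = 24, 2000
rng = np.random.default_rng(1)
X = rng.normal(size=(n_voxels, n_genes))   # placeholder for expression values

# Pearson correlation between every pair of gene profiles across the voxels
C = np.corrcoef(X, rowvar=False)

# Keep each pair once (upper triangle) and apply a correlation threshold
pairs = np.argwhere(np.triu(C, k=1) > 0.8)
print(f"{len(pairs)} strongly coregulated gene pairs (r > 0.8)")
```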
Journal of Neuroscience Methods, 2003
Voxelation allows high-throughput acquisition of multiple volumetric images of brain gene expression, similar to those obtained from biomedical imaging systems. To obtain these images, the method employs analysis of spatially registered voxels (cubes). For creation of high-resolution maps using voxelation, relatively small voxel sizes are necessary, and instruments will be required for semiautomated harvesting of such voxels. Here, we describe two devices that allow spatially registered harvesting of voxels from the human and rodent brain, giving linear resolutions of 3.3 and 1 mm, respectively. Gene expression patterns obtained using these devices showed good agreement with known expression patterns. The voxelation instruments and their future iterations represent a valuable approach to the genome-scale acquisition of gene expression patterns in the human and rodent brain.
Omega - International Journal of Management Science, 1986
Venture capitalists are experts who specialize in evaluating new business ventures. When asked how, they stress the overwhelming importance of the entrepreneur. In this paper, entrepreneur characteristics are identified and then examined for predictive potential. Models are developed on the basis of venture capitalists' judgments for these characteristics and for the level of venture success. A second set of models is based on actual venture outcome as the criterion. Lastly, the performance of the models and expert judgments are compared.