George Ostrouchov - Academia.edu

Papers by George Ostrouchov

Research paper thumbnail of Extending Skel to Support the Development and Optimization of Next Generation I/O Systems

2017 IEEE International Conference on Cluster Computing (CLUSTER)

Research paper thumbnail of Large sparse least squares computations

Research paper thumbnail of An evaluation of the state of time synchronization on leadership class supercomputers

Concurrency and Computation: Practice and Experience

Research paper thumbnail of Introducing a New Client/Server Framework for Big Data Analytics with the R Language

Proceedings of the XSEDE16 on Diversity, Big Data, and Science at Scale - XSEDE16, 2016

Research paper thumbnail of Parallel Model-Based Clustering

Research paper thumbnail of DOCUMENT AVAILABILITY: Reports produced after January … are generally available free via the …

Research paper thumbnail of Programming with Big Data – Interface to Parallel Unidata NetCDF4 Format Data Files

Research paper thumbnail of Uncertainty Analysis of a Heavily Instrumented Building at Different Scales of Simulation

Simulation plays a large role in understanding the behavior of building envelopes. With the increasing availability of computational resources, it is feasible to conduct parametric simulations for applications such as software model calibration, building control optimization, or fault detection and diagnostics. In this paper, we present an uncertainty exploration of two types of buildings: a) the thermal conductivity properties of the envelope of a heavily instrumented residential building involving more than 200 sensors, and b) a sensitivity analysis of a stand-alone retail building from the U.S. Department of Energy's reference model. For the residential building, 156 input parameters judged important by experts were varied using a Markov Order process, generating hundreds of gigabytes of data across tens of thousands of simulations. For the commercial building, 20 parameters were varied using a fractional factorial design requiring just 1,024 simulations and generating on the order of a few hundred megabytes of data. These represent a wide range of ensemble sizes, from a few simulations to tens of thousands. Depending on the number of simulations in an ensemble, the techniques needed to make sense of the information can be very different and potentially challenging; the method of analysis also depends strongly on the experimental design. The Markov Order sampling strategy and the fractional factorial design presented here represent two approaches for large-scale sensitivity analysis of buildings at two different scales of simulation. The paper presents the analysis using descriptive statistics as well as multiple analysis of variance (ANOVA) techniques for comparison and contrast.
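To make the commercial-building workflow concrete, here is a minimal R sketch of the two-level factorial screening and ANOVA idea described in the abstract. It is not the paper's actual model: the envelope parameter names, the response function, and the design size (a full 2^5 factorial rather than the 1,024-run fractional factorial in 20 parameters) are all invented for illustration.

```r
## Minimal sketch, not the paper's workflow: a two-level factorial screen of a
## few hypothetical envelope parameters, analysed with ANOVA in base R.

set.seed(1)

# Full two-level factorial in 5 invented parameters (2^5 = 32 runs); the retail
# building study above used a 1,024-run fractional factorial in 20 parameters.
design <- expand.grid(
  wall_conductivity = c(-1, 1),
  roof_insulation   = c(-1, 1),
  window_u_value    = c(-1, 1),
  infiltration_rate = c(-1, 1),
  hvac_efficiency   = c(-1, 1)
)

# Stand-in for a building energy simulation: a noisy linear response with one
# interaction (purely synthetic, no physical meaning).
design$energy_use <- with(design,
  100 + 8 * wall_conductivity - 5 * hvac_efficiency +
    3 * window_u_value * infiltration_rate + rnorm(nrow(design), sd = 1))

# ANOVA over main effects and two-way interactions, in the spirit of the
# multiple analysis-of-variance comparisons mentioned in the abstract.
fit <- aov(energy_use ~ (wall_conductivity + roof_insulation + window_u_value +
                         infiltration_rate + hvac_efficiency)^2, data = design)
summary(fit)
```

In a real ensemble study, each row of such a design table would drive one building-simulation run, and the ANOVA would be applied to the simulated outputs rather than to a synthetic response.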

Research paper thumbnail of Distributed Dimension Reduction Algorithms for Widely Dispersed Data

Research paper thumbnail of A Matrix Computation View of FastMap and RobustMap Dimension Reduction Algorithms

SIAM Journal on Matrix Analysis and Applications, 2009

Research paper thumbnail of Programming with Big Data – Demonstrations of pbd Packages

Research paper thumbnail of Programming with Big Data – MPI Profiling Tools

Research paper thumbnail of Programming with Big Data – Interface to MPI

Research paper thumbnail of Seeing the Unseeable

SciDAC Review, May 29, 2008

The SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) is a highly productive effort combining the forces of leading visualization researchers from five different institutions to solve some of the most challenging data understanding problems in modern science. ...

Research paper thumbnail of Programming with Big Data – Base Wrappers for Distributed Matrices

Research paper thumbnail of Programming with Big Data – Scalable Linear Algebra Packages

Research paper thumbnail of Spatial Statistical Models and Optimal Survey Design for Rapid Geophysical Characterization of UXO Sites

Research paper thumbnail of Dose estimation from daily and weekly dosimetry data: Final draft

Statistical analyses of data from epidemiologic studies of workers exposed to radiation have been based on recorded annual radiation doses (the yearly dose of record). It is usually assumed that the dose values are known exactly, although it is generally recognized that the data contain uncertainty due to measurement error and bias. In our previous work with weekly data, a probability distribution was used to describe an individual's dose during a specific period of time, and statistical methods were developed for estimating it from weekly film dosimetry data. That study showed that the yearly dose of record systematically underestimates doses for Oak Ridge National Laboratory (ORNL) workers, which could result in biased estimates of dose-response coefficients and their standard errors. The results of this evaluation raise serious questions about the suitability of the yearly dose of record for direct use in low-dose studies of nuclear industry workers. Here, we extend our previous work to use the full information in pocket-meter data and develop the Data Synthesis for Individual Dose Estimation (DSIDE) methodology. Although the DSIDE methodology in this study is developed in the context of daily and weekly data to produce a cumulative yearly dose estimate, in principle it is completely general and can be extended to other combinations of time periods and measurements. The new methodology takes into account the measurement error produced by the film and pocket-meter dosimetry systems, the biases introduced by policies that led to recording left-censored doses as zeros, and other measurement and recording practices. The DSIDE method is applied to a sample of dose histories obtained from hard-copy dosimetry records at ORNL for the years 1945 to 1955. First, the rigorous addition of daily pocket-meter information shows that the negative bias is generally more severe than was reported in our work based on weekly film data alone; however, the amount of bias also varies greatly between person-years. Second, the addition of pocket-meter information reduces uncertainty for some person-years while increasing it for others. Together, these results suggest that detailed pocket-meter and film dosimetry information is required to obtain unbiased and reliable dosimetry data for use in epidemiologic studies of workers at ORNL.
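The left-censoring issue described above (below-threshold readings recorded as zeros) can be illustrated with a small R sketch. This is not the DSIDE methodology itself, only a hedged example of censoring-aware maximum-likelihood estimation under an assumed lognormal dose distribution; the detection limit and all data are synthetic.

```r
## Minimal sketch, not the DSIDE method: censoring-aware maximum likelihood for
## doses recorded as zero when they fell below a detection limit. All values
## are synthetic and the lognormal assumption is for illustration only.

set.seed(2)

limit     <- 0.05                                     # hypothetical detection limit
true_dose <- rlnorm(500, meanlog = -2, sdlog = 1)     # synthetic "true" doses
recorded  <- ifelse(true_dose < limit, 0, true_dose)  # policy: below-limit doses recorded as zero

# Negative log-likelihood that treats recorded zeros as left-censored at
# `limit` instead of as exact zeros.
negloglik <- function(par) {
  mu    <- par[1]
  sigma <- exp(par[2])                                # log parameterisation keeps sigma > 0
  cens  <- recorded == 0
  ll_cens <- sum(cens) * plnorm(limit, mu, sigma, log.p = TRUE)  # P(dose < limit)
  ll_obs  <- sum(dlnorm(recorded[!cens], mu, sigma, log = TRUE))
  -(ll_cens + ll_obs)
}

fit <- optim(c(0, 0), negloglik)

# Censoring-aware estimate of the mean dose vs. the naive mean that takes the
# recorded zeros at face value; the naive mean tends to be biased low, echoing
# the negative bias discussed in the abstract.
exp(fit$par[1] + exp(fit$par[2])^2 / 2)               # lognormal mean from fitted parameters
mean(recorded)
```

The actual DSIDE approach additionally combines daily pocket-meter and weekly film readings and models recording practices; this sketch only shows how treating zeros as censored observations, rather than true zeros, changes the estimate.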

Research paper thumbnail of A model search procedure for hierarchical log-linear models

Research paper thumbnail of …Xmap: Fast Dimension Reduction Algorithm for Multivariate Streamline Data
